| Column | Type | Min length | Max length |
| --- | --- | --- | --- |
| Input | string | 251 | 41.6k |
| Output | string | 137 | 9.7k |
| input_ids | list | 157 | 2.05k |
| attention_mask | list | 157 | 2.05k |
| labels | list | 157 | 2.05k |
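Each row pairs a raw Input/Output string with its tokenized form; in the first example's raw rows the labels field simply duplicates input_ids. Below is a minimal sketch of how such records could be produced; the tokenizer checkpoint, the maximum length, and the function name are illustrative assumptions, not details stated by the dataset itself.

```python
# Hypothetical preprocessing sketch: derive input_ids / attention_mask / labels
# from the raw "Input" text. The checkpoint name is an assumption for
# illustration; the dataset does not say which tokenizer was used.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # hypothetical choice

def tokenize_record(record, max_length=2048):
    """Attach token-level fields to an {"Input": ..., "Output": ...} record."""
    enc = tokenizer(record["Input"], truncation=True, max_length=max_length)
    record["input_ids"] = enc["input_ids"]
    record["attention_mask"] = enc["attention_mask"]
    # In the rows shown here, labels mirror input_ids (a causal-LM convention);
    # this is an observation about the displayed data, not a stated spec.
    record["labels"] = list(enc["input_ids"])
    return record
```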
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
The paper studies why and when averaging the weights of a model improves out-of-domain generalization, by way of a new bias-variance-covariance-locality decomposition of the expected target-domain error. They show how diversity is needed and thus propose Diverse Weight Averaging (DiWA), wherein the weights of n independently trained models from a shared initialization are averaged together in an optionally greedy fashion. They show how DiWA outperforms weight averaging across n model checkpoints of a single training run and other baselines on the public DomainBed benchmark.

Strengths:
- The paper is very well written: the exposition is clear, well organized, of high quality, free of typos, and each section is well motivated. For example, "Limitations of the flatness-based analysis" discusses why the current explanation for weight averaging's performance is insufficient and why a different type of analysis, like their 4-term decomposition, is needed.
- The proposed bias-variance-covariance-locality decomposition (Proposition 1) is novel and aids our understanding of weight averaging; furthermore, it directly motivates their proposed DiWA.
- DiWA is evaluated on DomainBed, which is the standard public benchmark for out-of-domain generalization right now. The experimental setup is sound and the comparisons with baselines are fair.

Weaknesses:
- Despite DiWA outperforming weight averaging of checkpoints of a single run (WA), it is unlikely to be adopted for some use cases. For example, for large language models there is no extra cost for doing WA, but training the model multiple times, as needed by DiWA, is not practical or feasible; the extra compute is probably better spent scaling the model size or training for longer.

Yes: extreme hyperparameter ranges lead to weights whose average may perform poorly; diversity is key as long as the weights remain averageable, i.e., linear connectivity in weight space is needed.

---

In this paper the authors propose a simple method to improve out-of-domain generalization, i.e., averaging weights from different training runs rather than from one run. Some theoretical analysis is also given to explain weight averaging (WA) and the proposed method. Experiments on the DomainBed benchmark show the performance of the proposed method.

Strengths:
1. The research question of when WA succeeds and how to improve it is interesting and relatively little explored.
2. The paper provides theoretical insights for WA and the proposed method.

Weaknesses:
1. The writing of the paper may have some overclaim issues. In the abstract the authors claim that, for out-of-distribution generalization in computer vision, "the best current approach averages the weights along a training run" (line 23), but from the introduction it seems that such a claim is based only on the observation from the DomainBed benchmark (lines 22-24). I am wondering whether the observation on one benchmark can represent the whole situation in computer vision.
2. The contribution of this paper is quite minimal, especially regarding the experimental results. There seems to be no large performance improvement of the proposed method when compared with MA [29] in Table 1, and under the random initialization the performance of the proposed method looks similar to that of SWAD [14].
3. The proposed method needs to ensemble weights from different runs (e.g., 20 runs in the experiments), which may cause unnecessary additional computational cost.

The authors have addressed the limitations of their proposed method.

---

This paper proposes Diverse Weight Averaging (DiWA), which averages weights obtained from different training runs starting from the same initial parameters and mildly different hyperparameters. The paper decomposes the expected OOD error into bias, variance, covariance, and locality, and claims that weight averaging is most effective when the covariance term dominates, which is verified empirically. They show improved OOD generalization performance on the DomainBed benchmark. Overall I think this is a solid paper with a conceptually simple method and clean writing.

Over ENS, WA has the benefit of requiring only one feedforward pass at test time. Both Lemma 1 and Figure 1 show that WA is a close approximation to ENS in the training setting of DiWA; in fact, WA seems to perform slightly better than ENS. DiWA is simple and effective, and the experiments show that it can be combined with previous advances for more benefits (LP initialization, leveraging diversity from the ERM/Mixup/CORAL algorithms, etc.).

To my knowledge this work has no potential negative societal impact other than what was already present in the existing literature.

---

In this paper the authors propose an approach to mitigating the generalization problem in computer vision under the common scenarios of either a difference in the data-generating marginals (covariate shift) or a difference in the class-conditional distributions, which they call concept shift or correlation shift. They connect previous work on the errors incurred by weight averaging and standard ensembling, and offer a new decomposition that better explains when weight averaging fails under correlation shift or covariate shift. Their proposed method of averaging diverse sets of weights leads to improved results on the DomainBed benchmark suite of datasets.

Strengths:
- The paper is well written. The sections are subdivided cleanly, and they preview the arguments established in each subsection, culminating in a larger point. The authors are clearly well versed in the literature on ensembling and transfer learning. In particular, Section 2.4 and its subsections develop the argument for why weight-averaged ensembles are robust to covariate shift or diversity shift quite convincingly. Also, the analysis in Section 2.2 of why flatness-based minimizers do not have any effect on OOD error was illuminating.

Weaknesses:
- There are a few small issues that I'd like to see addressed. At the outset, in the abstract, the authors state that the best current approaches for out-of-distribution generalization are derived from averaging the weights along a training run. This is not quite so, as there are multiple other approaches: SWAG, deep ensembles, LNNS (learning neural network subspaces), loss surface simplexes for mode-connecting volumes, and fast ensembling. These all introduce greater member diversity through different mechanisms, so it would be more correct to broaden the related work section to include them and to acknowledge them in the abstract.
- On lines 185-186 the authors state that they follow the objective of decorrelating the learning procedures of ensemble members, citing the DICE paper. It is an exaggeration to say that the DICE objective is followed in Section 3: the authors of DICE perform considerable work to implement a variational approximation to the conditional entropy bottleneck between two learners, whereas the authors of this work do not go to such lengths to decorrelate the features extracted from the inputs by ensemble members.

Comments:
- I think the paper would benefit from a perspective that comments upon the difficulty of transfer learning based on the presence of neural collapse or learning neural network subspaces, especially as the authors note that the size of the space spanned by the furthest member from the barycentre is important.
- It is hard to believe that this paper's claims of increased diversity by new initializations are an actual advance; this is present in all the modern ensembling literature (cf. papers descended from the deep-ensembles literature).
- In Table 1 there does not seem to be a comparison against simple deep ensembles (f_ENS in the notation of the paper).
- A question I have with the setup in 2.1 is why we would not want to compare against gains made by fine-tuning in the target distribution. In most practical scenarios there is limited target data available to train on (certainly in almost all modern NLP tasks), so this is a baseline to consider. Including this baseline, where a model or ensemble of models trained on S is allowed to be fine-tuned in increments of data or learning iterations on data drawn from T before being evaluated, would qualify the degree to which DiWA is a feasible solution to practical transfer learning.

Overall, despite some small quibbles, I find that there is a lot to offer in this paper, and the community would benefit from seeing it in a first-class venue.

Yes, they have.
### Summary:
The reviewers unanimously recommend accepting the paper. Congratulations! My only concern is in the related work: the submission mentions that "the recent model soups by Wortsman et al. [28] developed a WA algorithm similar to Algorithm 1; however, the task, the theoretical analysis, and most importantly the goals of these two works are different." This is not an accurate characterization, because Wortsman et al. [28] were also interested in out-of-distribution generalization: their paper mentions robustness and distribution shift several times and contains results on multiple OOD test sets. The results in this submission and in Wortsman et al. [28] reinforce each other, since the two papers evaluate on different OOD benchmarks and find that weight averaging helps in both. I encourage the authors to clarify this in their related work section so that the reader can correctly put the results in context.
[input_ids, attention_mask, and labels for this example omitted: long token-ID arrays; labels duplicates input_ids and the attention mask is all ones.]
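The reviews and meta-review in this first example center on diverse weight averaging: fine-tune n models independently from a shared initialization with mildly different hyperparameters, then average their weights into a single network that needs only one forward pass at test time. The following is a minimal PyTorch-style sketch of the averaging step, assuming the runs produce architecture-compatible state dicts; the names are illustrative and this is not the paper's code.

```python
import copy
import torch

def average_weights(models):
    """Average the parameters of models trained from a shared initialization,
    returning a single merged model (WA/DiWA-style weight averaging)."""
    avg_model = copy.deepcopy(models[0])
    avg_state = avg_model.state_dict()
    for key in avg_state:
        # Stack the corresponding tensor from every run and take the mean,
        # casting back to the original dtype (e.g., for integer buffers).
        stacked = torch.stack([m.state_dict()[key].float() for m in models])
        avg_state[key] = stacked.mean(dim=0).to(avg_state[key].dtype)
    avg_model.load_state_dict(avg_state)
    return avg_model
```

As the reviews caution, this only helps when the individual solutions stay linearly connected in weight space; extreme hyperparameter ranges can produce weights whose average performs poorly.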
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
The authors propose an inductive graph partitioning (IGP) framework across multiple snapshots of a dynamic graph to produce an effective partitioning of the graph. IGP is based on training on the snapshots of the graph using a dual GNN architecture and is used to generalize to subsequent snapshots of the graph. The paper proposes an interesting framework which is resistant to permutations of the training set used by the IGP framework. Specifically, the framework derives graph embeddings based on the input graph, given as an adjacency matrix and a feature matrix which encodes neighbor similarity; the embeddings are then used in the assignment of the node-disjoint cluster labels. The ablation studies clearly show the effect of different aspects of the proposed architecture.

The reporting of the trade-off scores is slightly unclear: what is the efficiency measure that is being traded off? Clarifying this in the main body of the results will make them clearer. Finally, there is literature, both in terms of discrete algorithms and of learning representations, that addresses the problem of incremental partitioning of an evolving graph (see for example https://rlgm.github.io/papers/41.pdf and https://www.vldb.org/pvldb/vol13/p1261-fan.pdf and references within); a comparison with this prior art in terms of the achievable trade-off would make the empirical analysis stronger.

Other drawbacks of the work include the assumed availability of training data that already contains an optimal partition of a snapshot; it is not clear how this can be achieved. Secondly, the comparisons in the experiments are against other ML models whose performance cannot be verified in terms of optimality loss, i.e., how far those solutions are from the optimal solution; in this context, a comparison to an algorithm with known approximation factors would help.

I feel the paper in its current form can be significantly strengthened in terms of the presentation of the empirical results, by clearly explaining the trade-off achieved as well as by comparing with more algorithms from the literature as baselines.

---

The authors propose an inductive graph partitioning framework across multiple evolving graph snapshots to alleviate the NP-hard challenge. It first conducts offline training of a dual graph neural network on historical snapshots to capture the structural properties of a system; the trained model is then generalized to newly generated snapshots for fast, high-quality online GP without additional optimization, where a better trade-off between quality and efficiency is achieved.

Strong points: overall, the paper is well written and the experimental part seems comprehensive.

A few comments:
1. Theorem 1 is stated in a very informal way. It seems not good to include the discussion of one concrete example (Figure 2) in the statement of a theorem; I would suggest moving that outside and discussing the potential meaning and implications of Theorem 1 on examples separately.
2. Figures 1 and 2 are both too small, and it is hard to read the details there.

The paper seems well written; here are a few typos:
1. In the first line of the first paragraph of Section 2, a comma is missing after the symbol $\mathcal{G}_2$.
2. On page 6, in the sentence just above Equation 12, "we drive" should be "we derive".

---

This paper considers the problem of solving the graph partitioning problem repeatedly over many different graphs. Graphs are sampled i.i.d. from an unknown but fixed distribution, and the goal of the algorithm is to solve the GP problem on each of the graphs. Two opposing quantities are at play: efficiency (on one hand, we could solve an NP-hard problem each time, but that would be prohibitively time consuming) and quality (on the other hand, we can generalize from the learnings on the other graphs, since they are related via the i.i.d. distribution). This paper proposes an NN architecture that uses a subset of the graphs in the family to learn an embedding, and then derives the solution for the other instances from this embedding via a matrix multiplication. Using a number of simulated and real-world datasets, they show that this method works well empirically.

The strengths of this paper are as follows:
- The problem considered is important and practical; graph partitioning is an important preprocessing step in many applications. Moreover, the model of graphs sampled i.i.d. has practical relevance and is also considered in many prominent works in the graph networks literature.
- For the most part the paper is written well, with a detailed explanation of the methodology and the results.
- The experiments are exhaustive and sound. They considered an exhaustive list of baselines and show that they perform nearly as well as or better than most methods on the range of tasks considered. They use publicly available datasets, and the results can largely be reproduced.

The weaknesses of this paper are as follows:
- The paper does not do a good job of crisply characterizing how its method differs from prior work. The reasons the authors give for why their method is better seem somewhat shallow/arbitrary: they claim that computing gradients etc. is expensive and time-consuming, yet their method is based on adversarial autoencoders, which need to be trained via gradient-based optimization methods, so this does not seem to be a real difference. The only difference I can see is that this paper considers a different NN architecture and training method compared to some of the other GNN-based methods (e.g., Hamilton et al.). I would like the authors to be a bit more precise and scientific, and also to credit prior work better.
- Along the lines of (1), it is not entirely clear to me why this method works; adding intuition about some of the choices would make it much better. At the moment it seems like they created a new method, threw it at the datasets, and it happened to work; a more principled study of why it works would greatly help the paper.
- The paper does not report CIs in the experiments. Some of the points are so close to each other that without CIs it is unclear whether there is a real improvement or just noise. This is important since the thrust of the claims made by this paper lies in the experiments, and having a rigorous statistical analysis gives more confidence in the results.

My main review is based on the weaknesses stated above; I am not totally confident about the importance/merit of the approach provided in this paper, and thus I am giving a weak reject.

---

In this work the authors focus on an important problem: building an inductive framework for graph partitioning. This is a major problem with the potential of improving the performance of various classic transductive algorithms for graph partitioning, both in terms of output quality and in terms of speed, when a new snapshot of a system needs to be partitioned into communities. This is a recent line of research (see e.g. Nazi et al. 2019, "GAP: Generalizable Approximate Graph Partitioning Framework"). The proposed framework can address snapshots with differing numbers of nodes by projecting them down to coarsened versions of the network. There are two versions of the framework, based on normalized-cut and modularity objectives respectively; the framework can potentially be used with other measures as well. A nice idea is to leverage the unsupervised communities extracted from modularity optimization or spectral algorithms through a dual GNN structure that can then be used to partition unseen instances fast.

The paper has various strong points, including an extensive experimental evaluation with numerous other baselines.

- What is the advantage that the dual GNN offers over other choices in prior work (Pan et al. 2018, Nazi et al. 2019)?
- Under what conditions on the training dataset does the framework work well? Is it easy to test those conditions? E.g., from Equation 11, which encodes a k-cut size, it seems that you implicitly assume small changes from snapshot to snapshot; is this what you mean by saying that the graphs are snapshots of a given complex system?
- Can you please elaborate on how the uncoarsening is done to get the actual community participation of the query graph G?
- Having an axis with t secs is not very informative for the proposed method; can you please be more specific about the runtimes?
- Can you test your method on communities with community ground truth?
- For the synthetic experiments using the stochastic block model, it would have been nice to see a comparison with the paper "Supervised Community Detection with Line Graph Neural Networks" by Chen et al., which provides state-of-the-art results.

The paper has various strong points, but the write-up can be improved. I found the paper overall hard to read; e.g., the choice of the dual GNN is not well justified (what issues is it resolving relative to GAP?). Reading Theorem 1 was kind of confusing, as the statement feels more like an observation derived from the definition of a partition. I also found the term "permutation invariant" for labels (011 vs. 100) a bit confusing, as up to that point I was expecting labels to be indicator vectors of which community a node participates in.

---

The authors consider the problem of inductive graph partitioning, which they formulate as clustering or partitioning multiple snapshots of a time-evolving graph for which we have no node correspondence. In other words, we cannot link the nodes in snapshot t to those in snapshot t+1, which prevents incremental or evolutionary clustering algorithms from being applied. They propose a complicated dual graph neural network (GNN) architecture for this problem setting and demonstrate potentially good clustering accuracy with low computation time on simulated and real networks.

Strengths:
- Performs graph partitioning on multiple graph snapshots while not requiring any form of node correspondence between snapshots.
- The proposed architecture includes a few novel elements.

Weaknesses:
- The presentation of results is extremely poor. Figure 3 is trying to show way too much; it took me several minutes of staring at the figure, zooming in and out, and examining multiple legends to understand what is being shown. Be selective in the results that you show in the main body and provide the rest in the supplementary. Text in figures and tables is too small.
- The proposed problem setting specifically assumes that there is no node correspondence between snapshots, but then all of the real-data experiments involve snapshots over time, which do have node correspondence; methods that use node correspondence (e.g., incremental and evolutionary clustering) could possibly do better on these datasets.
- There is no motivating application for the no-node-correspondence setting. Since there is no dataset targeting the proposed setting, a motivating application without data could also provide some justification for why one should care about this setting, but no such application is present, so it feels more like the authors invented a problem to solve.
- On real data sets the number of clusters k is not known, so the authors run the Louvain algorithm to choose k; but the Louvain algorithm also outputs a set of clusters, which appears to be discarded. Furthermore, the computation time of the Louvain algorithm does not appear to be included (although I can't be sure, because the presentation of the results is such a mess).
- Theorem 1 is trivial, and I'm not sure why it is highlighted as a theorem.

After the discussion period: I have added a strike through the incorrect portions of my review that have been clarified by the authors. The presentation of the results is extremely poor, making the empirical results very hard to interpret. I also have concerns regarding the problem setting and evaluation, so I do not view this paper as ready for publication at this time.

---

This paper proposes an inductive GNN-based framework that partitions unattributed graphs with possibly different numbers of nodes and ground-truth clusters. Extensive empirical evaluations using both synthetic and real datasets demonstrate that the new method achieves a better trade-off between quality and efficiency when compared with 15 baselines from 4 categories.

Strengths:
- Both the inductive graph partitioning problem, which the paper attempts to solve, and the dual-GNN adversarial approach are interesting.
- The authors conduct comprehensive experiments and showcase the usefulness of their method, even though there are a few notable limitations of the current work, which I detail below.
- The authors explicitly state their assumptions and carry out a number of experiments that cover a reasonable range of inductive scenarios.

Weaknesses / limitations:
1. The paper does not provide a theoretical guarantee on the generalization to new, unseen graphs. It would be nice if a guarantee on modularity or NCut were provided.
2. The proposed framework does not work with attributed graphs. For graph partitioning this might be the usual setting, but the authors should comment on whether incorporating node features could be helpful.
3. The authors assume that the set of graphs, which includes both the offline training and the online testing graphs, follows similar distributions. This assumption easily breaks in practice for graphs that come from different domains: Table 21 in the appendix seems to indicate that, in general, the proposed inductive method may not generalize well due to distribution shifts. On the other hand, I appreciate that the authors conduct additional transfer tests even though the results are mixed; this is a plus, since it provides the reader with additional information. For the experiments with LFR synthetic graphs in the main text, what would happen if we increased the range of n, e.g., from 500 to 10000?
4. The method requires knowing the number of ground-truth clusters k. This may not really be a limitation, as many graph partitioning methods also require k as a parameter.

Minor issues:
1. In order for (1) and (2) to be equivalent, do we need to perform some rounding algorithm on an optimal solution H_t of (2)?
2. Theorem 1 is trivial; it should be stated as a fact, with the detailed arguments left to the appendix.
3. The font size in Figure 2 is too small; I cannot read it without zooming in on a screen.

The paper proposes a new GNN-based framework for inductive graph partitioning. The authors conduct extensive experiments that cover a reasonable range of scenarios and showcase the usefulness of the proposed method. My concerns are that (1) there is no generalization guarantee, and (2) the method may fail due to distribution shifts, as shown in the experiments.
### Summary:
This paper considers an important problem, graph partitioning, from a transductive viewpoint: assuming that the graphs are generated by independent draws from an unknown distribution, learn some parameters in an offline phase and use them in the online phase, much as in PAC learning. The authors have also answered many of the reviewer questions; in particular, the comparison with existing work is substantial. While I laud the positives of this work and the importance of the transductive approach, I see one issue: as a reviewer points out, and as agreed by the authors, the paper does not provide a theoretical guarantee on the quality of the generalization to unseen graphs. It would have been useful, e.g., to consider this on Erdős-Rényi G(n,p) models, stochastic block models, etc.
[Token-ID array and all-ones attention mask for this example omitted.]
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 643, 4216, 1580, 597, 403, 2905, 3066, 253, 891, 301, 3268, 436, 2929, 29328, 247, 48257, 10336, 326, 4648, 247, 8578, 273, 253, 14580, 275, 253, 2021, 281, 3037, 271, 21496, 285, 840, 38422, 253, 2900, 281, 253, 643, 10872, 407, 970, 436, 21496, 3066, 247, 4315, 25219, 970, 247, 1180, 273, 15524, 285, 1524, 10186, 15302, 597, 921, 326, 436, 1332, 2987, 973, 45190, 253, 20544, 273, 436, 2929, 403, 347, 3637, 50275, 783, 1895, 1908, 310, 1774, 285, 8542, 4216, 41463, 310, 271, 1774, 638, 21678, 3213, 275, 1142, 4893, 25761, 253, 1566, 273, 1146, 19958, 891, 301, 556, 8542, 17200, 285, 671, 2783, 275, 1142, 11906, 2987, 275, 253, 4216, 6928, 6239, 50275, 1542, 253, 954, 629, 253, 2929, 310, 3542, 973, 342, 7000, 7148, 386, 279, 273, 253, 16182, 285, 253, 1543, 253, 4679, 403, 671, 41389, 285, 3590, 597, 2783, 271, 41389, 1618, 273, 1666, 25379, 285, 921, 326, 597, 1347, 4829, 973, 390, 1805, 685, 954, 3082, 327, 253, 2491, 273, 8892, 2783, 50275, 9328, 897, 13644, 2130, 15302, 285, 253, 1543, 476, 8127, 320, 23775, 50276, 783, 14855, 273, 436, 2929, 310, 347, 3637, 50275, 783, 2929, 1057, 417, 513, 247, 1175, 2628, 273, 7550, 2881, 39330, 849, 616, 3082, 9184, 432, 2720, 2987, 253, 4606, 2139, 616, 1332, 310, 1805, 326, 253, 4477, 1375, 1646, 8489, 20126, 45749, 26257, 597, 1750, 326, 12672, 27935, 3966, 403, 866, 561, 400, 7816, 33136, 2568, 616, 1332, 310, 1754, 327, 48960, 6753, 2083, 351, 398, 534, 878, 281, 320, 10166, 3066, 11786, 1754, 13757, 3082, 594, 352, 3133, 751, 436, 310, 417, 247, 1524, 3064, 253, 760, 3064, 891, 476, 923, 310, 326, 436, 2929, 19401, 247, 1027, 48257, 10336, 285, 3733, 1332, 2429, 281, 690, 273, 253, 643, 305, 9866, 1754, 3082, 24088, 10546, 7839, 1162, 355, 891, 651, 751, 253, 4477, 281, 320, 247, 2372, 625, 10799, 285, 8249, 50276, 12563, 6152, 2720, 2987, 1805, 50275, 28694, 253, 3104, 273, 337, 352, 310, 417, 7094, 2590, 281, 479, 2139, 436, 1332, 2987, 6240, 30328, 670, 690, 273, 253, 10165, 651, 1056, 352, 1199, 1805, 387, 253, 2774, 352, 3133, 751, 597, 3562, 247, 747, 1332, 13222, 352, 387, 253, 15302, 285, 352, 4592, 281, 789, 690, 625, 3505, 74, 6216, 1263, 327, 2139, 352, 2987, 651, 10260, 1361, 253, 2929, 50275, 783, 2929, 1057, 417, 1304, 21693, 275, 253, 3368, 690, 273, 253, 2792, 403, 594, 2810, 281, 1016, 643, 326, 1293, 21693, 697, 12744, 604, 697, 1524, 7756, 390, 816, 6046, 436, 310, 1774, 1580, 253, 19031, 273, 253, 3916, 1160, 407, 436, 2929, 310, 275, 253, 4679, 285, 1907, 8132, 263, 375, 7605, 1783, 4245, 625, 7162, 275, 253, 1543, 619, 2022, 2278, 310, 1754, 327, 253, 32213, 891, 4767, 1840, 891, 717, 417, 9106, 13224, 670, 253, 6349, 961, 262, 273, 253, 2746, 2530, 275, 436, 2929, 3021, 891, 717, 4933, 247, 5075, 12009, 50276, 7152, 339, 9852, 436, 789, 253, 4477, 2770, 327, 271, 1774, 1895, 3652, 271, 42115, 7792, 323, 4216, 41463, 436, 310, 247, 2201, 1895, 342, 253, 2442, 273, 11138, 253, 3045, 273, 2710, 10610, 811, 43324, 50276, 267, 46042, 323, 4216, 41463, 1097, 275, 2426, 273, 3453, 3290, 533, 671, 275, 2426, 273, 3885, 672, 247, 747, 29679, 50276, 1171, 247, 985, 3198, 281, 320, 10883, 264, 715, 7888, 436, 310, 247, 3332, 1386, 273, 2561, 923, 24088, 295, 23248, 1162, 355, 6247, 8037, 2087, 12729, 16851, 4216, 41463, 7792, 50276, 783, 4081, 7792, 476, 2953, 14496, 28853, 342, 26704, 1180, 273, 7632, 407, 35104, 731, 1066, 281, 820, 1032, 2348, 9508, 273, 253, 2990, 627, 403, 767, 9508, 273, 253, 7792, 1754, 327, 12650, 2624, 285, 23178, 414, 16566, 2975, 253, 7792, 476, 320, 7826, 908, 342, 643, 5593, 347, 973, 247, 5322, 
2934, 310, 281, 25057, 253, 440, 35421, 7888, 10375, 432, 23178, 414, 13757, 390, 9879, 11333, 949, 247, 8746, 305, 9866, 2605, 326, 476, 840, 320, 908, 281, 10883, 3809, 39709, 10872, 50275, 783, 2929, 556, 2710, 2266, 2792, 1690, 271, 9470, 5661, 7103, 342, 7418, 643, 1666, 25379, 50274, 5371, 310, 253, 5750, 326, 253, 8746, 305, 9866, 6131, 689, 643, 10165, 275, 2720, 789, 3199, 1162, 355, 4765, 295, 23248, 1162, 355, 6247, 50273, 4524, 752, 2515, 327, 253, 3733, 10895, 1057, 253, 7792, 789, 973, 310, 352, 3477, 281, 1071, 1110, 2515, 24088, 432, 5150, 1903, 326, 31360, 247, 465, 7317, 1979, 352, 3133, 326, 368, 29688, 5467, 1355, 2544, 432, 29679, 281, 29679, 50276, 261, 436, 752, 368, 1599, 407, 3981, 326, 253, 14580, 403, 14496, 28853, 273, 247, 1677, 2570, 985, 50275, 5092, 368, 4496, 21184, 327, 849, 253, 440, 1940, 1032, 2980, 310, 2218, 281, 755, 253, 4588, 3114, 11497, 273, 253, 7316, 4216, 305, 50274, 30819, 271, 7844, 342, 246, 4706, 84, 310, 417, 1077, 27096, 323, 253, 4081, 1332, 476, 368, 4496, 320, 625, 2173, 670, 253, 1408, 3181, 50274, 5092, 368, 1071, 634, 1332, 327, 7888, 342, 3114, 3216, 33024, 50274, 1542, 253, 13506, 4679, 970, 253, 19191, 2972, 7645, 352, 651, 452, 644, 5322, 281, 923, 247, 5301, 342, 253, 2929, 50276, 35421, 3114, 5481, 342, 1386, 4216, 11454, 6928, 407, 260, 864, 1162, 355, 326, 3400, 1375, 23037, 14387, 1543, 50272, 783, 2929, 556, 2710, 2266, 2792, 533, 253, 3630, 484, 476, 320, 5520, 891, 1119, 4583, 253, 2929, 1892, 281, 1239, 24088, 253, 4327, 273, 253, 8746, 305, 9866, 310, 417, 973, 17285, 752, 3374, 310, 352, 30426, 432, 8037, 4361, 253, 10012, 337, 369, 2238, 273, 21643, 347, 253, 3908, 9193, 625, 347, 271, 8310, 6012, 432, 253, 5426, 273, 247, 10883, 3966, 50276, 74, 671, 1119, 253, 1307, 273, 29391, 13727, 323, 13301, 470, 883, 4632, 2233, 247, 2372, 21643, 347, 598, 281, 326, 1127, 891, 369, 16764, 13301, 281, 320, 15301, 11390, 273, 534, 3114, 247, 4666, 45347, 275, 50276, 7152, 339, 431, 248, 4477, 1908, 253, 1895, 273, 42115, 4216, 41463, 534, 597, 36803, 347, 17524, 390, 41463, 2709, 14496, 28853, 273, 247, 673, 1173, 11932, 4216, 323, 534, 359, 452, 642, 4666, 17668, 275, 643, 3000, 359, 2550, 3048, 253, 7632, 275, 29679, 246, 281, 1110, 275, 29679, 246, 18, 534, 16897, 32809, 390, 16483, 17524, 11333, 432, 1146, 3732, 253, 12661, 247, 9542, 8746, 4216, 11454, 2990, 305, 9866, 10336, 323, 436, 1895, 4758, 285, 7568, 7826, 1175, 17524, 7200, 342, 1698, 13782, 673, 327, 15524, 285, 1524, 6928, 20544, 50275, 468, 13015, 4216, 41463, 327, 2709, 4216, 14496, 28853, 1223, 417, 10568, 667, 830, 273, 17668, 273, 7632, 875, 14496, 28853, 50276, 856, 7334, 10336, 3797, 247, 1643, 4460, 3603, 50276, 20881, 1255, 265, 50275, 49836, 273, 1543, 310, 6685, 4105, 4677, 495, 310, 2820, 281, 921, 1039, 1512, 1199, 352, 2335, 479, 2067, 2909, 273, 17303, 387, 253, 4677, 21282, 272, 275, 285, 562, 285, 17565, 2709, 38209, 281, 2096, 752, 310, 1146, 2011, 320, 13687, 275, 253, 1543, 326, 368, 921, 275, 253, 2022, 2133, 285, 2085, 253, 1551, 275, 253, 24864, 2505, 275, 8442, 285, 7180, 310, 1512, 1355, 50276, 856, 7334, 1895, 4758, 5742, 19584, 326, 627, 310, 642, 4666, 17668, 875, 14496, 28853, 533, 840, 512, 273, 253, 1524, 941, 4679, 6388, 14496, 28853, 689, 673, 534, 513, 452, 4666, 17668, 3082, 326, 897, 4666, 17668, 24088, 32809, 285, 16483, 17524, 812, 6830, 513, 1805, 327, 841, 15302, 50276, 2369, 15265, 839, 2898, 323, 253, 642, 4666, 17668, 4758, 1580, 627, 310, 642, 10895, 12262, 253, 4081, 4758, 247, 15265, 839, 2898, 1293, 941, 812, 671, 
2085, 690, 22861, 347, 281, 2139, 581, 943, 1557, 670, 436, 4758, 533, 642, 824, 2898, 310, 1246, 594, 352, 9193, 625, 751, 253, 4477, 23179, 247, 1895, 281, 8415, 50276, 251, 1524, 941, 5239, 253, 1180, 273, 9959, 465, 310, 417, 1929, 594, 253, 4477, 1408, 253, 29245, 87, 404, 5933, 281, 5206, 465, 533, 253, 29245, 87, 404, 5933, 671, 18012, 247, 873, 273, 9959, 534, 4620, 281, 320, 25665, 33810, 253, 13782, 673, 273, 253, 29245, 87, 404, 5933, 1057, 417, 3176, 281, 320, 2908, 3738, 891, 16216, 320, 2119, 984, 253, 9759, 273, 253, 1543, 310, 824, 247, 4840, 50276, 33921, 337, 310, 14916, 285, 516, 417, 2119, 2139, 352, 310, 16318, 347, 247, 10012, 50276, 6438, 5955, 2180, 891, 452, 2879, 247, 9974, 949, 253, 13583, 11821, 273, 619, 2278, 326, 452, 644, 31637, 407, 253, 4477, 253, 9759, 273, 253, 1543, 310, 6685, 4105, 2403, 16774, 1543, 1077, 1892, 281, 4665, 891, 671, 452, 7350, 5001, 253, 1895, 4758, 285, 7103, 594, 891, 513, 417, 1859, 436, 2929, 347, 4704, 323, 9311, 387, 436, 673, 5474, 33032, 2520, 2929, 29328, 271, 42115, 305, 9866, 3169, 7792, 326, 27959, 39152, 3567, 14580, 342, 6830, 1027, 3904, 273, 7632, 285, 3216, 33024, 9959, 9470, 16774, 27163, 970, 1097, 13506, 285, 1524, 15302, 7568, 326, 253, 747, 1332, 33526, 247, 1805, 5454, 2727, 875, 3290, 285, 6733, 672, 2429, 342, 1458, 1666, 25379, 432, 577, 9050, 20544, 50276, 15617, 253, 42115, 4216, 41463, 1895, 534, 253, 2929, 9437, 281, 8415, 285, 253, 8746, 305, 9866, 48960, 2746, 403, 4722, 253, 4477, 2589, 11088, 4679, 285, 34647, 253, 31471, 273, 616, 1332, 1014, 2167, 627, 403, 247, 1643, 16613, 7364, 273, 253, 1655, 789, 534, 891, 588, 2508, 2708, 253, 4477, 11120, 1375, 253, 13260, 285, 4459, 562, 247, 1180, 273, 4679, 326, 3835, 247, 5272, 2491, 273, 42115, 15216, 50276, 20881, 1255, 265, 50276, 17465, 569, 50276, 18, 253, 2929, 1057, 417, 2085, 247, 10527, 12215, 327, 253, 26647, 281, 747, 39709, 14580, 352, 651, 320, 5322, 604, 247, 12215, 327, 23178, 414, 390, 295, 7317, 310, 2530, 50276, 19, 253, 4081, 7792, 1057, 417, 789, 342, 12877, 14580, 323, 4216, 41463, 436, 1537, 320, 253, 7312, 4758, 533, 253, 4477, 943, 4385, 327, 604, 24049, 4666, 3386, 812, 320, 9371, 50276, 20, 253, 4477, 5467, 326, 253, 873, 273, 14580, 534, 3797, 1097, 28841, 3733, 285, 3909, 5175, 14580, 956, 2074, 10670, 436, 9376, 4354, 13471, 275, 3946, 323, 14580, 326, 1705, 432, 1027, 10625, 2829, 3127, 275, 253, 30762, 3133, 281, 5224, 326, 275, 2087, 253, 4081, 42115, 1332, 778, 417, 39970, 973, 1955, 281, 3268, 15036, 327, 253, 643, 1133, 891, 11435, 253, 4477, 2589, 3081, 27090, 5216, 1014, 2167, 253, 1543, 403, 6804, 436, 310, 247, 5043, 1580, 352, 3400, 253, 9414, 342, 3081, 1491, 323, 253, 4679, 342, 298, 925, 13506, 14580, 275, 253, 2022, 2505, 752, 651, 5108, 604, 359, 2572, 253, 2491, 273, 295, 24088, 432, 6783, 281, 30321, 50276, 21, 253, 1332, 4419, 8958, 253, 1180, 273, 3216, 33024, 9959, 465, 352, 778, 417, 1663, 320, 247, 12291, 347, 1142, 4216, 41463, 3082, 671, 2430, 465, 347, 247, 4764, 50276, 37585, 3374, 50276, 18, 275, 1340, 323, 337, 285, 374, 281, 320, 6425, 513, 359, 878, 281, 1347, 690, 46551, 5933, 327, 271, 8654, 2900, 288, 85, 273, 374, 50276, 19, 10012, 337, 310, 14916, 352, 943, 320, 4767, 347, 247, 958, 342, 7000, 7125, 1669, 281, 253, 30762, 50276, 20, 253, 8266, 1979, 275, 4677, 374, 310, 1512, 1355, 891, 2550, 1239, 352, 1293, 21282, 272, 275, 327, 247, 3601, 253, 2929, 29328, 247, 747, 305, 9866, 3169, 7792, 323, 42115, 4216, 41463, 253, 4477, 2589, 9470, 4679, 326, 3835, 247, 5272, 2491, 273, 15216, 326, 
34647, 253, 31471, 273, 253, 4081, 1332, 619, 7350, 403, 337, 627, 310, 642, 26647, 12215, 285, 374, 253, 1332, 778, 1891, 1955, 281, 3268, 15036, 347, 2011, 275, 253, 4679, 2490, 187, 4118, 18435, 27, 2520, 2929, 19401, 271, 1774, 1895, 4216, 41463, 432, 247, 811, 43324, 31460, 7384, 326, 253, 14580, 403, 4561, 407, 3907, 21354, 432, 271, 7202, 3268, 3037, 690, 3602, 275, 271, 28841, 3408, 285, 897, 841, 275, 253, 3909, 3408, 1199, 347, 275, 19162, 4715, 253, 4477, 452, 671, 9577, 1142, 273, 253, 37317, 3533, 275, 1798, 253, 5301, 342, 5368, 789, 310, 6832, 50275, 6050, 891, 826, 438, 253, 37865, 273, 436, 789, 285, 253, 6349, 273, 253, 811, 43324, 2746, 891, 923, 271, 2523, 347, 247, 37317, 2792, 562, 285, 347, 5821, 407, 253, 4477, 253, 2929, 1057, 417, 2085, 247, 10527, 12215, 273, 253, 3290, 273, 253, 26647, 281, 39709, 14580, 352, 651, 452, 644, 4217, 24088, 281, 1908, 436, 327, 2827, 38755, 445, 28212, 305, 18650, 3210, 19191, 2972, 3210, 3966 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

The paper proposes to address the heteroscedastic regression problem using deep neural networks. It assumes the variance of the heteroscedastic noise is known as privileged information and suggests reweighting the samples by their noise variance in the loss. The major issue to me is the lack of novelty: heteroscedastic regression is a classic problem in statistics, and reweighting using the inverse variance is a textbook method (see chapter 10 of http://www.stat.cmu.edu/~cshalizi/ADAfaEPoV/ADAfaEPoV.pdf). This paper failed to cite any relevant reference and to clarify the novelty of applying the method to a deep learning setting. An interesting problem could be how to estimate the variance with a deep network in a reliable way; this was done previously using classic models. However, this paper did not tackle this harder and more interesting problem; instead, it assumes the variance is simply given during training, which is not very realistic in a real-world setting. The experiments are all synthetic and are not particularly convincing. Finally, the paper claims a lot of connection with privileged information (LUPI), but I found it hard to consider this variance a similar concept to privileged information, which is realistic and interesting.

In this paper, a reweighting technique is proposed to suppress the impact of heteroscedastic label noise in regression model training. The objective function of the training process is composed of a weighted combination of instance-wise training losses, where the instance-wise weight is determined by the estimated noise variance, based on prior information about the label generation process. The weighting formulation is inspired by the best possible estimator of noisy measurements reaching the Cramer-Rao bound. The paper is clearly written: it explains the problem definition and the methodological formulation well. However, we think the innovation in this work is limited. The downsides of this paper are as follows. 1) It is a strong and usually impractical assumption to have a priori knowledge of the label noise in the regression model training process; especially in the proposed method, the estimate of the noise variance needs to be accurate enough to help suppress the noise impact. Accordingly, it would be better to jointly learn the noise distribution and the regression/classification model, so as to optimize the tolerance against the data-dependent noise. 2) It only considers the noisy learning process for regression. However, in classification scenarios, label noise is usually presented in the form of label flipping, and the proposed reweighing technique is not directly applicable in that case. Please refer to the following work for further reading: Learning with noisy labels, Nagarajan Natarajan, Inderjit S. Dhillon, Pradeep Ravikumar, and Ambuj Tewari, NIPS 2013.

Paper summary: this work targets regression tasks with noisy labels and proposes to incorporate knowledge about the variance of the Gaussian noise corrupting the observed labels to weight the loss function at training time. The proposed method is evaluated in a series of experiments involving deep networks trained according to the weighted loss function, and compared to a baseline method that omits training samples whose label noise variance is larger than a threshold. Results indicate the proposed method is more robust to noisy labels when compared to alternatives that do not exploit the information on the noise affecting labels.

Reasons for score: on the one hand, the paper is extremely well written and somehow pedagogic, in that it provides compelling motivations for considering heteroscedasticity and its possible sources, and general intuitive descriptions of the proposed method before specialising them to the instance they evaluate. On the other hand, I think the prose lacks sufficient technical depth on the model they use, on its relation to textbook material on heteroscedasticity (e.g. for maximum likelihood estimation, MLE), and on the properties of the proposed method. The experimental evaluation, while representing a reasonable starting point, is not sufficient to fully understand the behaviour and the properties of the proposed method. For these reasons, I think this work cannot be accepted as is.

Positive points: 1) The editorial quality of this paper is high, and the overall motivations given to support the problem statement are compelling and well discussed. This also relates to the fundamental assumption underlying this work: access to privileged information, taking the form of knowledge of the stochastic noise variance affecting observed labels at training time. 2) The proposed method appears to be well positioned with respect to the recent literature on statistical modelling with noisy labels, especially concerning neural-network-based methods. It is unfortunate, though, that the literature scan does not cover well-known approaches to tackling heteroscedastic noise in simple linear models or in general MLE frameworks, which may be considered textbook material. 3) The experimental evaluation considers two regression tasks on the UCI bike sharing and UTKFace datasets, considering several variants of noise generation processes affecting labels in different and sufficiently realistic manners.

Negative points: 1) My main concern is the thin contribution of this paper. The technical details of the proposed method are not sufficiently developed: drawing inspiration from Fisher information calls for an appropriate discussion of the likelihood model and its noise model to begin with. Then, I think the relation of the proposed idea to simple linear models, for which heteroscedastic regression has been studied in great detail (e.g. [1] for a general reference, and for MLE e.g. [2]), would become clearer and would give the authors the opportunity to develop the merits of their proposed method. For example, the weighted least squares method is very similar to what is proposed in this paper. [1] Econometric Analysis, Greene, William H., 2003, Pearson Education India. [2] Maximum likelihood estimators with heteroscedastic errors, G. R. Fisher, Review of the International Statistical Institute, vol. 25, 1957. See also Pattern Recognition and Machine Learning, Bishop, 2006, chapters 5 and 6. 2) The experimental evaluation is not sufficient to appreciate the virtues of the proposed method. For several noise distributions, including the additional ones considered in the supplement, the proposed method (BIV) does not seem to behave much better than the proposed baseline that uses a simple threshold. Additionally, the figures are cropped and do not allow one to get a sense of what happens for all competing methods (fig. 1 and fig. 2); in fig. 3 the figures report test loss, and there seems to be overfitting kicking in. Also, it is mentioned that the baseline method requires setting a cutoff parameter, but the proposed method also depends on a hyperparameter to optimise (done in the appendix). As a consequence, it is difficult to appreciate the main advantage of BIV with respect to the baseline. Finally, in fig. 3 there are clear signs of overfitting; why did the authors suggest (end of sec. 3) that their work does not require regularisation?

Main criticism: I think, overall, the main criticism I have for this work is that the contribution is not sufficient. The main idea proposed in the paper fits in sec. 4.2, and it is based on well-known results from textbooks. In eq. 5, the summation term is MLE with heteroscedasticity, and the loss is scaled by a coefficient that collects statistics on the sample batch. If this batch is very small, or its elements are not sufficiently diverse, I am afraid it could have a negative impact on the optimisation process; this is why in the experiments the authors chose a batch size of 256. One possible way to overcome this criticism is to clarify the likelihood model and compare the proposed method to existing approaches that address heteroscedastic Gaussian noise. A possible piece of advice would be to reduce, or move to the appendix, the discursive parts on heteroscedasticity and the general formulations (e.g. sec. 2.2, sec. 4.1), and gain more space to explain how BIV differs from what is known.

Additional comments: a note on the experiments using the UTKFace dataset: in this case the MLP used, with 4 layers, may be a bit too simple in light of the high test loss on ground-truth labels; did you try convolutional architectures, even simple ones? Just to clarify the test loss reported in the figures as a function of training steps: what is the test batch size, the same as the training batch size? Is the test loss computed according to $\sum_{k \in \text{test batch}} \mathcal{L}\big(f_{\theta}(\mathbf{x}_k), \tilde{y}_k\big)$?

### Summary:

The manuscript presents a deep network approach for the heteroscedastic regression problem. It assumes the variance of the heteroscedastic noise is known as privileged information and suggests reweighting the samples by their noise variance in the loss. Three reviewers agreed that the manuscript is not ready for publication. The major issue is the lack of novelty: heteroscedastic regression is a classic problem in statistics, and reweighting using the inverse variance is a textbook method. R2 and R4 confirmed that they have read the author response; the rebuttals are useful to clarify some points, especially those related to experimental settings and results. However, they are not convinced by the authors' argument on novelty and on whether the assumption is realistic.
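To make the inverse-variance reweighting discussed throughout these reviews concrete, the following is a minimal sketch of a heteroscedastic (weighted) regression loss where each sample is scaled by the reciprocal of its assumed-known label-noise variance. It is illustrative only: the function name, the epsilon term, and the per-batch normalisation are assumptions for this sketch, not the exact BIV formulation or any code from the paper under review.

```python
import torch

def inverse_variance_weighted_mse(pred, target, noise_var, eps=1e-8):
    """Weighted MSE where sample k is scaled by 1 / (sigma_k^2 + eps).

    This mirrors the textbook weighted-least-squares / heteroscedastic MLE
    weighting the reviewers refer to; `noise_var` holds the (assumed known)
    label-noise variance of each sample in the mini-batch.
    """
    weights = 1.0 / (noise_var + eps)
    # Normalise the weights within the mini-batch so the loss scale stays
    # comparable across batches (an assumption, echoing the batch-level
    # coefficient mentioned in the third review).
    weights = weights / weights.sum()
    return (weights * (pred - target) ** 2).sum()

# Toy usage: the third sample has a very noisy label and is down-weighted.
pred = torch.tensor([0.9, 2.1, 3.0])
target = torch.tensor([1.0, 2.0, 3.5])
noise_var = torch.tensor([0.1, 0.1, 4.0])
loss = inverse_variance_weighted_mse(pred, target, noise_var)
```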
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: in this paper the authors propose knodempconline an mpc that relies on knowledge based neural ordinary differential equations as dynamic models and updates the model online using neural networks experiments are conducted both in simulation and on a real robot pros the paper is well written and well structured related work is sufficiently addressed real world experiments are conducted and limitations are discussed the approach is interesting and pseudocode is provided in the form of algorithm cons some clarifications should be provided about the approach which is currently not reproducible according to how it is described how do you compute the m matrix it is unclear how you use learned model in the mpc although there is a section about that in the paper specifically given you learn several models how do you choose or mix them in order to obtain your prediction it would be interesting to compare against other learning methods even if not online with mpc to have a better sense of how it performs docsepthe major contribution of this paper is to augment the existing work 20 to be implemented for online training for the learning of dynamics to execute online neural networks are stacked with weighted outputs to progressively capture the new features in the vehicle dynamics experiments are conducted to show the capability of the proposed method to adapt to changing dynamics of a quadrotor the major strength of the paper is to design an iterative update scheme to incorporate newly learned nn to describe the dynamical model from online data better the more precise dynamics will yield better knowledge in the mpc controller and hence improve tracking performance subject to changes in the dynamics the major weakness of the paper is the lack of descriptionexplanation of the selection mechanism of the stacked nn specifically the authors mentioned the selection matrix mpsi which couples the neural network with knowledge however theres no description of how the selection matrix is formed or designed the ambiguity introduces further questions to the proposed framework for example what if a newly introduced nn contains the information captured by a previously stacked nn this question tailors more towards the efficiency of stacking nns where it is possible that the nns contain repeated knowledge another weakness is that the paper frequently refers to the adaptiveness of the proposed framework yet the literature review covers no results from the adaptive control community however there are recent adaptive control designs that work fairly well for quadrotor control the authors should include it in the literature review and cite related works eg 1 kostadinov d and scaramuzza d 2020 august online weightadaptive nonlinear model predictive control in 2020 ieeersj international conference on intelligent robots and systems iros pp 11801185 ieee 2 hanover d foehn p sun s kaufmann e and scaramuzza d 2021 performance precision and payloads adaptive nonlinear mpc for quadrotors ieee robotics and automation letters 72 pp690697 3 joshi g virdi j and chowdhary g 2020 asynchronous deep model reference adaptive control arxiv preprint arxiv201102920 4 wu z cheng s ackerman ka gahlawat a lakshmanan a zhao p and hovakimyan n 2022 may l 1 adaptive augmentation for geometric tracking control of quadrotors in 2022 international conference on robotics and automation icra pp 13291336 ieee docsepthis paper developed a 
framework for online updating of the dynamics modeled by knode for mpc this ensured adaptability to changes in the environment and even when unknown weights were actually attached to the drone that change was reflected appropriately in the dynamics to achieve optimal control strengths the online adjustment of dynamics enhanced adaptability to the environment the proposed method introduced a trick to avoid forgetting the past which is an important point in online learning weaknesses online dynamics learning itself is not new in avoiding past forgetting the appendtype method diverges in both memory and computational cost the adaptability of the proposed method is not well demonstrated with only one example docsepthis paper presents an online learning framework in the context of model predictive control the method includes a joint parallel framework where the control scheme is updated based on new incoming models and the models are trained and updated based on the acquisition of new data in an online fashion the models are based on the nowwellknown neural ode framework the paper evaluates the method with a series of simulated quadrotor experiments wrt tracking error strengths the justification for knode presented in lines 8292 is appreciated the limitations section discusses several of my concerns indicating that the the authors seem to anticipate several questions about the work that might arise weaknesses maintaining a potentially large number of neural networks seems burdensome and possibly unnecessary there is a lack of a discussion or theory wrt convergence guarantees the paper seems to imply that more data is better but this is not always the case both in terms of the data that is generated itself andor in terms of the algorithms that are using these data ### Summary:
strengths interesting combination of mpc and knode knowledge-based neural differential equations hardware experiments with the crazyflie with an external computer weaknesses no detail about the solver and more importantly about the overall computing cost of the approach especially for long experiments (more data) not many baselines eg non-mpc baselines some ablations might be useful to understand the contribution of each component no source code is provided the related work section can be improved no theoretical guarantees statistics how many times was the training algorithm run with different seeds post-rebuttal update i would like to thank the authors for their effort to improve their paper i agree with one of the reviewers that the technique itself is not truly original mpc + model learning is not novel in addition the statistics are too weak for empirical results a single experiment with only 5 seeds is not enough to conclude that the training works reliably the training algorithm might have been lucky or the authors might have chosen their network well however i do think that combining learning and mpc is a very interesting and promising topic and this paper is an interesting contribution in that direction
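None of the reviews above include code, so the following minimal NumPy sketch (added by the editor, not taken from the paper under review) illustrates one plausible reading of the hybrid "knowledge + neural ODE" dynamics model and the weighted stack of online-trained residual networks that the reviewers discuss. The placeholder dynamics, network sizes, weights, and RK4 integrator are all assumptions made only for this illustration.

```python
# Sketch of a knowledge-based neural ODE with stacked online residuals (assumptions only).
import numpy as np

def f_known(x, u):
    """First-principles part of the dynamics (placeholder: point mass under gravity)."""
    vel = x[3:]
    g = np.array([0.0, 0.0, -9.81])
    return np.concatenate([vel, u + g])  # u: mass-normalized thrust acceleration

class ResidualMLP:
    """Tiny MLP (x, u) -> residual state derivative, standing in for a learned correction."""
    def __init__(self, n_in=9, n_hidden=16, n_out=6, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = 0.1 * rng.standard_normal((n_hidden, n_in))
        self.W2 = 0.1 * rng.standard_normal((n_out, n_hidden))

    def __call__(self, x, u):
        h = np.tanh(self.W1 @ np.concatenate([x, u]))
        return self.W2 @ h

class StackedKnode:
    """Known dynamics plus a growing, weighted stack of residual networks."""
    def __init__(self):
        self.residuals, self.weights = [], []

    def add_residual(self, net, weight=1.0):
        # called whenever a new residual has been fitted on freshly collected data
        self.residuals.append(net)
        self.weights.append(weight)

    def f(self, x, u):
        dx = f_known(x, u)
        for w, net in zip(self.weights, self.residuals):
            dx = dx + w * net(x, u)
        return dx

    def step(self, x, u, dt=0.01):
        # RK4 integration of the hybrid ODE, usable inside an MPC rollout
        k1 = self.f(x, u)
        k2 = self.f(x + 0.5 * dt * k1, u)
        k3 = self.f(x + 0.5 * dt * k2, u)
        k4 = self.f(x + dt * k3, u)
        return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

model = StackedKnode()
model.add_residual(ResidualMLP(seed=1), weight=1.0)  # e.g. fitted on the latest batch
x_next = model.step(np.zeros(6), np.array([0.0, 0.0, 9.81]))
```

In an MPC loop, `step` would be used to roll out candidate control sequences, and `add_residual` would be called each time a new residual network has been trained on newly collected flight data; the open question the reviewers raise is precisely how the weights over the stacked residuals are chosen.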
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes a metalearning method that uses natural language to describe features the natural language feature description can be encoded with pretrained models in the same way for different tasks thus enabling the transferability of these features across tasks the author conducted experiments using some statistical datasets that have categorical features and numerical labels and split these datasets into subsets for meta training meta validation and meta testing experiments show that the proposed method achieves faster adaptation to a new task with a few taskspecific examples strengths 1 the idea of describing features in natural language in the metalearning setting looks novel and reasonable because such natural language features can be easily shareable 2 the problem formulation and method description are very clear weaknesses 1 despite the idea being general the evaluation setting looks very restricted the datasets are not commonly used in the existing metalearning literature as far as i know i would suggest the author justify why they chose these datasets and test their method more broadly on other datasets too 2 its not clear what the tasks are in their datasets and how related they are how different are the features of different tasks the author should give some examples for readers to better understand the difficulty of the generalization across these tasks 3 from the results in table 4 it seems most of the improvement is from using pretrained models comparing nn vs ours which makes me question the contributions of other parts of the proposed method 4 regarding technical details the author averages the embeddings of different features equation 3 which raises a concern about how representative the resulting vector can be if the number of features increases in this work it seems not a big issue because the number of features for each task is very small 45 but i question this if the proposed method is used for many real machine learning problems which usually have thousands of features yes the author discussed the limitations quite well docsepthe paper proposes a metalearning method that builds relationships across supervised learning tasks with different feature spaces using feature descriptions written in natural language when different tasks may share similar or related feature descriptions benefiting from the pretrained language model eg bert which contains different kinds of knowledge implicitly the proposed method could improve the generalization performance on unseen tasks with small labeled data the authors also empirically demonstrate that the proposed method outperforms the existing metalearning methods in realworld datasets strengths 1 the novel problem setting is reasonable and challenging where instances have the same feature space in the same task and different feature description sets and feature spaces among different tasks 2 extensive experiments and ablation studies are provided to determine the effectiveness of the proposed method 3 the paper is wellwritten and easy to follow weaknesses 1 despite the plausible setting there are still a few points that the authors fail to explain elaborately for example whether the feature description sets or feature type sets have any intersection between the metatraining datasets and the metatest datasets the question is out of my curiosity about whether the metatraining model using feature descriptions could really generalize
to those unseen tasks which contain entirely different feature description sets or even feature types from the metatraining tasks 2 as the authors mentioned in the limitation section the proposed method has several limitations first the method seems to depend on the number of metatraining datasets second in this work it is doubtful whether the method could extend to those datasets where instances contain numerical features or the corresponding labels are discrete it could be better to find a dataset conforming to the characteristics described above to evaluate the effectiveness of the method 3 a baseline in which the neural network with feature descriptions is not metatrained could be considered which could evaluate the necessity of metatraining despite using the feature description the authors have addressed the limitations docsepthis work proposes to use the bert representation of the textual description of the feature value as input feature to improve the model generalization particularly in the metalearning setup empirically compared with baseline metalearning methods on 2 datasets the proposed method achieves better or competitive performances the idea of the work is very interesting and intuitive the paper is well written and easy to understand the empirical evaluation shows some promising results however the empirical evaluation is done only on two relatively small datasets which makes it difficult to judge how general the proposed method is in practice it would be much more informative to try this method on large datasets the work only tests the proposed method in two relatively small datasets hence its hard to judge how general the conclusions are docsepthis paper considers the problem of heterogeneous meta learning where the different datasets potentially have different numbers and types of features to enable using the same model for all the datasets the authors consider an approach where they use textual feature descriptions which makes the model permutation invariant with respect to the features and also agnostic to the number of features this allows the same model to be used for all tasks regardless of their composition of features because new features can simply be encoded using a sentence encoder they show that they outperform other approaches on two datasets strengths 1 the approach is very sound and in line with other recent approaches which propose to use descriptions of classes features and tasks 1 the method was compared with a lot of other baselines which helps put the scores in context 1 the gains seem to be consistent and strong on both metatasks considered weaknesses 1 it looks like the only difference between the proposed method and a baseline mdk b is the usage of the feature encoder fig 1 which is a 3-layer neural network it looks like the authors agree with that as well line 220 so the technical novelty although guided by good intuition seems to boil down to the addition of a single layer on top of a baseline 1 i think it is important to experiment with different types of sentence encoders given that the descriptions they consider are very short its possible that simple word2vec vectors bag-of-words averaging could do the trick 1 another baseline could have been added which finetunes the nn model on the downstream task has been addressed in the supplementary pdf ### Summary:
this paper presents a novel metalearning approach based on learning a sentence encoder which maps feature descriptions to embeddings the sentence encoder is shown to generalize to new tasks during the test phase hence allowing few-shot learning the main concern raised by the reviewers was about the use of only two datasets which are nonstandard for evaluating metalearning however as the authors note the proposed approach requires using datasets where feature descriptions are available and hence the choice of datasets seems reasonable the authors are encouraged to revise the paper to discuss how the approach might be generalized to other setups in metalearning
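As a reading aid for the reviews above, here is a small editor-added sketch of the mechanism the reviewers describe: each tabular feature is represented by encoding its natural-language description with a pretrained text encoder, and the per-feature embeddings are averaged into a single instance vector (the "equation 3" averaging that one reviewer questions). The hash-based stand-in encoder, the feature names, and the linear head are assumptions made so the sketch runs without external models; a real system would use a pretrained sentence encoder such as BERT, as the reviewers note.

```python
# Sketch of feature-description embedding and averaging (assumptions only).
import numpy as np

EMB_DIM = 32

def encode_text(text):
    """Stand-in for a pretrained sentence encoder (e.g. BERT): a hash-based
    pseudo-embedding so the sketch runs without downloading a model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(EMB_DIM)

def instance_embedding(features):
    """features: dict mapping a natural-language feature description to its value."""
    per_feature = [encode_text(f"{desc}: {val}") for desc, val in features.items()]
    # averaging makes the representation invariant to feature order and count
    return np.mean(per_feature, axis=0)

# hypothetical example instance with described features (names are made up)
x = instance_embedding({
    "age of the patient in years": 54,
    "resting blood pressure in mm hg": 130,
})
head = np.random.default_rng(0).standard_normal(EMB_DIM)  # toy linear prediction head
prediction = float(head @ x)
```

Because the instance vector has a fixed size and is invariant to the number and order of features, the same prediction head can be meta-trained across tasks with different feature spaces, which is the transferability property the reviewers highlight; the averaging step is also where the concern about many features blurring the representation arises.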
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: in the paper the authors propose to minimize the discrepancy between pairs of conditional neural axioms to align the embedding spaces of different kgs this method is justified by the authors study of all kinds of owl2 properties the author also studied the influence of margin lambda on less constrained long-tail entities the authors conducted experiments by adding the proposed model on top of the best models for entity alignment the results are mixed but the proposed model improves the sea and rdgcn consistently reasons to accept 1 this paper provides a theoretical point of view of the entity alignment task which was mostly studied with empirical methods the idea to align the axioms by minimizing wasserstein distance is welljustified 2 the experiment results are in favor of the intuitions 3 the method described in this paper can be in principle adapted to any previous and future eea scoring functions reasons to reject the idea of using adversarial training to align spaces especially crosslingual spaces is based on the assumption of a large overlap between kgs for kgs that are on very different domains this method may include errors as two heterogeneous kgs do not naturally fit in one unified space the influence of overlap on this method is not wellstudied all of the kgs used in the experiments are general domain kgs docsep overall this paper proposes an entity alignment framework that leverages the dependencies between entities and relations reminiscent of transr lin y et al aaai 15 to further refine the results of the conventional embeddingbased alignment approach merits the proposed framework is shown to be effective in improving the performance of baseline models weaknesses 1 the methodological contribution is limited 2 the theoretical explanation part is trivial and contributes little scientific knowledge major comments 1 the bound in eq5 seems meaningless since the assumption ie one relation and one neighbour on which the bound is based is too idealistic to meet in practice 2 the explanations about the behaviour of embeddingbased entity alignment in both section 22 and section 31 are straightforward and trivial thus contribute little knowledge 3 in my point of view section 21 and section 31 are too lengthy it would be better to highlight the most important part ie the loss function while avoiding emphasising too much on the detailed definitions and examples minor comments there are some typos and grammar mistakes that need to be proofread carefully eg ex2 should be ey2 in the paragraph just above eq6 and take x for example should be take x as an example docsepoverall comments entity alignment plays an important role in improving the quality of crosslingual knowledge graphs as one of the most important solutions embeddingbased methods aim at learning a semantic space where the unique entity across knowledge graphs can have the closest distance most research focuses on entitylevel granularity but discards the whole picture of the embedding space of crosslingual kgs besides the aligned entity pairs as the labelled data this paper extended the labelled data with the conditional neural and basic axioms which are actually sets of randomly selected entities or entities with the same relation type then the final objective is to align the crosslingual knowledge graphs by both optimizing the distance of labelled entity pairs and neural axioms clarity the presentation and organization of this paper is very difficult to follow besides the
grammar and typo errors there exist many concepts that are not clear which makes it difficult to understand the main idea of this work for example the concepts of axiom and ontology are introduced before giving a formal definition the claimed challenges that have not been solved well by previous works are not convincing enough in the 3rd paragraph the authors argued that previous research shows very good performance but has not focused on the theoretical analysis after reading the whole draft its still a big question on the given theoretical analysis of this work taking theorem 1 as an example its more like a justification but not a theorem to show the connection between the proposed axiom and ontology ontology provides an empirical structure to organize and classify the entities in the kgs its structure will be changed along with the kg in hand from this paper i cannot find the connection between the relation type alignment loss and the ontology please pay more attention to the writing and organization its an interesting work but not ready questions for rebuttal 1 how to build the relation seed in this work are they labelled manually if so it will have a flexibility issue when dealing with multiple kgs 2 please compare to a recently proposed method [1] which also optimizes the distance between a group of entities from crosslingual kgs different from this work it is not conditioned on the relation type but based on a randomly sampled group of entities minor comments 1 in the paragraph around equation 6 the ex2 should be ey2 2 figure 3 shows the overall architecture of the proposed method it should appear in the main content references [1] pei shichao lu yu and xiangliang zhang improving crosslingual entity alignment via optimal transport in proceedings of the international joint conference on artificial intelligence ijcai 2019 docsepsummary the paper proposes neoea an approach that further constrains kg embedding with ontology knowledge the paper first tries to summarize the existing embeddingbased entity alignment methods stating that most of the methods choose transe as the scoring function but their embedding features are not aligned well compared to the neuralbased or compositionbased loss function the paper therefore solves this problem by developing a new neoea architecture which shows that adding kginvariant ontology knowledge can minimize such difference the experiment shows the new constraints can improve stateoftheart baselines strengths the idea of using an ontology constraint as an additional loss is new the paper shows significant improvements when combining the newly designed loss with stateoftheart baselines the overall paper is clear to understand weaknesses the paper doesnt have a related work section figure 1 is a little bit messy especially for figure 1d it would be better to make the figure a little bit larger moreover its unclear which color corresponds to the first kg making readers confused section 2 should be merged into the introduction section some of the words are confusing such as neural axioms the neural ontology alignment which is stated in the appendix figure is much more clear than the current conditional neural axioms section 3 is not wellorganized the theorems and axioms should be propositions or hypotheses the usage of those words is a little bit weird here the proof in the appendix is also not well defined the experiment section is too short and not very informative it would be better to include more comprehensive analysis such as providing some visualization before and after the new loss etc
post-rebuttal i appreciate that the authors have conducted revisions on the current version however i think the current paper is probably still not strong enough for iclr ### Summary:
the authors study the problem of augmenting embeddingbased entity alignment in knowledge graphs kg through the use of joint alignment with deduced neural ontologies more specifically alignment of the kg neural axioms motivated by the observation that the distance between the representations of two potentially aligned entities must be bounded by a minimal margin which can be problematic when there are many potential alignments they propose aligning neural axioms by a wasserstein distance-based loss between learned entity embeddings conditioned on the relation embeddings experiments are conducted on openea against multiple strong baselines showing that adding the ontology alignment to these baselines improves the results pros the addition of aligning conditional ontologies is ostensibly novel for kgs with sufficient entityrelation overlap the proposed neoea method is applicable neoea has been shown empirically to improve many sota methods cons while the theoretical justification is a welcome motivation the reviewers did not find the theoretical arguments significant nor convincing overall the narrative needs work to make the paper more selfcontained and approachable for a broader range of readers the reviewers and i found many concepts and statements somewhat confusing and needing clearer context and contrast with existing works evaluating along the requested dimensions quality conceptually the core idea is interesting wellmotivated original and ostensibly effective empirically neoea is shown able to improve upon several strong underlying baseline methods i believe that all of the reviewers find the work interesting and promising however there were continuing concerns about the strength and value of the described theory it isnt clear if stronger theory isnt possible or if this just hasnt been fleshed out clarity most of the reviewers and i found the paper difficult to follow as a selfcontained work in terms of concepts clear definitions eg mathcal t isnt defined early on and the actual applicability of the theory the figures help but even these need some work a related work section or more structured presentation of related work might be clarifying along with running examples and a more unifying math presentation that captures existing and proposed work after thinking about this more it is actually a relatively simple in a good way and clever idea however it took several readings and readings of related work to get there additionally the fact that all of the reviewers were concerned about different limitations is concerning wrt clarity appendix b helps a bit and i believe can also be put into the main paper originality as best as the reviewers and i can tell we havent seen this method applied to entity alignment despite this being a relatively mature subfield significance the consensus seems to be that the approach could be a notable contribution to an important area however it also appears that most of the reviewers dont feel the paper is ready for publication at a toptier venue yet as stated throughout this metareview there are several aspects to like about this work including the originality of the idea strong motivation and good empirical results however we all agreed that the paper isnt quite ready in its current form thus i presently recommend reject for this submission
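To make the summary above more concrete, the following editor-added sketch shows one way the described alignment term could look: the distribution of entity embeddings conditioned on a relation in one KG is pulled toward the distribution conditioned on the aligned relation in the other KG via a sliced one-dimensional Wasserstein distance. This is an interpretation for illustration only; the random embeddings, the sliced approximation, and the quantile grid are assumptions and not the paper's actual implementation.

```python
# Sketch of a conditional embedding-alignment term via sliced Wasserstein distance
# (editor's interpretation, with placeholder data).
import numpy as np

def sliced_wasserstein(a, b, n_proj=50, n_quantiles=100, seed=0):
    """Average 1-D Wasserstein-1 distance between two embedding sets over random
    projection directions; a and b are (n_samples, dim) and may differ in n_samples."""
    rng = np.random.default_rng(seed)
    q = np.linspace(0.0, 1.0, n_quantiles)
    total = 0.0
    for _ in range(n_proj):
        v = rng.standard_normal(a.shape[1])
        v /= np.linalg.norm(v)
        pa, pb = a @ v, b @ v
        # closed-form 1-D distance via matched quantiles of the two projections
        total += np.mean(np.abs(np.quantile(pa, q) - np.quantile(pb, q)))
    return total / n_proj

# placeholder "conditional" embedding sets: entities appearing as the head of some
# relation r in KG1 versus entities of the aligned relation r' in KG2
rng = np.random.default_rng(1)
heads_kg1 = rng.standard_normal((200, 16)) + 0.5   # deliberately shifted distribution
heads_kg2 = rng.standard_normal((150, 16))
alignment_term = sliced_wasserstein(heads_kg1, heads_kg2)
# during training this term would be added to the base entity-alignment loss
```

In training, such a term would simply be added to whichever base entity-alignment objective is used, which is consistent with the observation in the reviews that the proposed loss can be layered on top of existing scoring functions.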
[input_ids: token-id encoding of the example above, omitted for readability]
[attention_mask: all 1s, omitted]
[labels: identical to input_ids, omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper proposes carlane a 3way simtoreal domain adaptation benchmark for 2d lane detection which encompasses the singletarget datasets molane tulane and the multitarget dataset mulane it compensates for data scarcity and encourages future research in lane detection the experiments part is limited the algorithms covered by this paper are not well rounded more classic algorithms should be included

docsepthe paper proposes a 3way sim to real benchmark for crossdomain lane detection including single and multitarget uda the authors use the simulation software carla simulator and a realbuilt 18th model vehicle to collect synthetic data and real lane data respectively based on the three datasets the authors introduce the first simtoreal domain adaptation benchmark for lane detection carlane well known uda methods like dann and adda are evaluated and the authors also propose a new method named sgpcs which achieves stateoftheart performance on carlane

1 lane detection is a cuttingedge field which has gained much attention in recent years the vast amount of collected labeled data has greatly facilitated progress in this field also a large benchmark for crossdomain lane detection can greatly promote the development of the field
2 extensive experimental results and visualization are provided the code is also provided making it easy to reproduce the result
3 the paper is well organized and easy to understand

1 this dataset is somewhat simple for domain adaptation ie the domain shift is small first the sourceonly model performs well on the target domain data and all three settings can approach 90 lane accuracy second the da method is not significantly improved compared to the sourceonly results it seems that da has no obvious effect this phenomenon is more obvious in the tsne visualization third the gap between the sourceonly results and the targetonly results is small which makes the significance of this study very limited
2 more explanations are needed why dann performs worse than source only in tab3 why does dann cause the phenomenon of negative transfer is it caused by the distribution of the dataset or is it caused by the characteristics of lane detection
3 only sim to real adaptation is included a benchmark for real to real eg model vehicle to tusimple is also important
4 the statistics of the datasets proposed by the authors and those of previous works need to be compared
5 some related works for crossdomain detection and selfsupervised lane detection methods are missing

docsepthe paper introduces a 3way benchmark called carlane focused on simulation to realworld unsupervised domain adaptation with single and multitarget the paper also provides dataset tools for labeling images the authors conduct extensive experiments and draw interesting conclusions

1 the paper proposes a systematic way to evaluate uda methods on the proposed dataset which is useful for research in lane detection
1 data from simulation the model vehicle and realworld scenarios allows flexible usage to learn lane detection models
1 the paper extends the standard singletarget uda to multitarget uda which is closer to realworld applications

it would be good to consider different weather conditions in the 3 proposed datasets
### Summary:
the meta reviewer has read the paper rebuttal reviews and the authors revision the meta reviewer agrees with the reviewers that this lane benchmark is of interest to uda and the authors release the data tool and source codes to support their claimed contribution the meta reviewer thus recommends acceptance
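For context on the dann baseline the reviews above question: the sketch below shows the gradient-reversal mechanism that dann-style unsupervised domain adaptation relies on. It is a generic minimal PyTorch sketch, not the carlane reference implementation; the lane task is abstracted to a plain cross-entropy over logits and the head/feature shapes are assumptions.

```python
import torch
from torch import nn

class GradReverse(torch.autograd.Function):
    # identity in the forward pass; multiplies the gradient by -lambda in the
    # backward pass (the gradient-reversal trick at the heart of DANN)
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lam * grad_out, None

def dann_step_losses(feat_src, feat_tgt, lane_logits_src, lane_labels_src,
                     domain_head, lam=0.1):
    # supervised task loss on the labeled (simulated) source batch
    task_loss = nn.functional.cross_entropy(lane_logits_src, lane_labels_src)
    # the domain classifier tries to tell source from target features; the
    # reversed gradient pushes the feature extractor toward domain invariance
    feats = torch.cat([feat_src, feat_tgt], dim=0)
    dom_logits = domain_head(GradReverse.apply(feats, lam))
    dom_labels = torch.cat(
        [torch.zeros(len(feat_src), dtype=torch.long, device=feats.device),
         torch.ones(len(feat_tgt), dtype=torch.long, device=feats.device)])
    domain_loss = nn.functional.cross_entropy(dom_logits, dom_labels)
    return task_loss + domain_loss
```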
[input_ids: token-id encoding of the example above, omitted for readability]
[attention_mask: all 1s, omitted]
[labels: identical to input_ids, omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
in the manuscript the authors propose squant which is a datafree quantization method that can apply posttraining quantization ptq without any backpropagation specifically squant takes advantage of approximated hessian information based on the assumptions and deductions in the paper squant tries to optimize a constrained absolute sum of error case instead of mse the authors show many experimental results to validate the effectiveness of squant

strengths
1 i think the paper is very well written with good illustrations
2 the related work section is thorough
3 the notations in section 21 are necessary and helpful so are algorithm 1 and 2 otherwise the notion of blocks layers elements inputoutput channels kernels will make the math become super confusing
4 the experimental results are stateoftheart with evaluations on various neural networks
5 i go through the math and find them generally correct the decompositionapproximation idea is decent

weaknesses
1 squant is for weight and activation quantization i think the weight quantization part is well supported but there lacks discussion about activation quantization especially how it affects the accuracy of the approximations and hessian representations used in squant
2 extra clarifications are needed for equation 7 to 8 as i understand it equation 7 to 8 works well for squant because of its iterative optimization on the three components but generally those two are not equivalent
3 how is the quantization range min max determined in squant would squant work better if combined with other orthogonal ptq methods
4 for completeness i wonder about 4bit quantization results of shufflenet

small issues
1 in supp a extra w

post rebuttal i think the rebuttal from the authors makes sense and can address my concerns excluding nontechnical issues such as novelty and potential influence as a result i maintain my score tending to accept i think the manuscript is technically sound with a solid understanding of the ptq problem the experimental results are quite good since there are still clarifications to be made and the fact that similar deductions and flipping methods exist in previous works i recommend a weak acceptance

docsepthis paper proposes a datafree quantization method based on the secondorder taylor expansion of the loss where the hessian matrix is approximated at different levels elementwise kernelwise and channelwise the authors progressively determine the quantized weights from elementwise to kernelwise and then to channelwise the derivation and solution of the quantization are novel empirical results show that the proposed method outperforms recent datafree methods

the paper is overall clearly structured and written the proposed method is wellmotivated and the resultant flipping solution is novel as far as i know this paper provides a brilliant way to directly use discrete optimization for quantization instead of conventional training with gradient backpropagation however the following points still require some more clarifications
- it is not clear why the firstorder term is omitted in equation 1
- it is not clear why the scaling parameters like eni kn and cm are omitted in equation 7 the solution from the proposed progressive optimization is not the only solution and other solutions may still depend on these scaling parameters

this paper is well written and the proposed method is novel

docsepthis paper proposes a new datafree quantization method
that does not require backpropagation nor finetuning the key idea is adopting hessianbased optimization that can be decomposed into three parts squante k and c corresponding to the three dimensions of the weight tensor using a few approximations such as crosslayer independence to simplify the optimization then instead of mse the authors introduce case the constrained absolute sum of error of the weight perturbation the experimental results show that the proposed dfq method outperforms even gdfq which is basically a kind of qat the proposed technique is especially useful for lowbit quantization

this paper is clearly written and wellmotivated after a successful approximation of the hessianbased approach the authors discuss how to minimize the necessity of activation distribution information in section 3 assuming that input feature maps autocorrelate with each other in a similar way the experimental results are impressive in section 4

1 what are the limitations of this work the quality of the proposed method would depend on the validity of the assumptions to ignore activation distribution if a summary of limitations and assumptions on the input distribution is provided it would be helpful to understand why this work is only limited to certain cnn models as written in the paper since transformers and mlps are increasingly utilized for the area that has been dominated by cnn models it would be necessary to describe which dataset or model architecture is best served by the proposed method
2 the flipping approach is limited to up or down with a step size of 1 is there any chance to improve the accuracy further if we increase the step size ie allowing wider flipping approaches
3 the flipping approach has been introduced in a few previous ptq techniques while such techniques usually assume that a calibration set is provided how close is the proposed work to those previous flippingbased ptq methods in terms of model accuracy if the difference is marginal does it mean that investigating input data distribution is inherently less important for cnn models is there any chance such a difference between datafree ptq and calibrationbased ptq can be larger depending on the characteristics of the dataset since all experimental data in this paper is given for imagenet only this paper might present an impression that the proposed method might be optimized for imagenetlike datasets

this paper introduces a few reasonable assumptions to alleviate the efforts to estimate the input data distribution successful approximations lead to improved model accuracy even with a very small number of bits to represent weights

docsepin this paper the authors propose a datafree quantization algorithm for deep neural networks called squant the main idea of squant is to decompose the hessianbased optimization objective into three components elementwise kernelwise and channelwise components in order to jointly optimize these three objective functions a constrained absolute sum of error case is studied and a progressive algorithm is used experiment results show that the squant algorithm is able to keep higher accuracy with the same number of bits compared to several baseline algorithms

strength
the main strength of this paper is the introduction of a multiscale approximation of the hessian matrix due to the scalability issue it is impossible to store the entire matrix and traditional methods usually only keep track of the diagonal terms inspired by matrix decomposition ideas this paper adopts a diagonal + blockwise diagonal + low rank approximation which is a good idea
experiment results also show that the matrix decomposition idea works well

weakness
i think there are some flaws in the derivation of the optimization objective
1 in formula 2, h^{w^l_m} is written as l_m x^l (x^l)^T, but from formula 3 to formula 4, E[h^{w^l_m}] is replaced by E[x^l (x^l)^T], where l_m disappears why does this step hold true
2 in formula 7 the objective is \sum_n \sum_i e_{n,i} (\Delta w^l_{m,n,i})^2 + \sum_n k_n \Delta w^l_{m,n} J_k (\Delta w^l_{m,n})^T + c_m \Delta w^l_m J_{nk} (\Delta w^l_m)^T, but in formula 8 the coefficients e_{n,i}, k_n and c_m disappear i understand that all the coefficients are positive from appendix a2 but as long as they are not all equal the objective should be different without these coefficients i dont quite understand how we get 8 from 7

the motivation to decompose the hessian matrix into diagonal blockwise diagonal and lowrank components is the crucial contribution of this paper experiment results show that the multiscale optimization objective leads to good performance after quantization i dont follow some steps in the derivations of the optimization objective this paper will be better if the authors can illustrate these steps well
### Summary:
the authors propose a datafree quantization method that can be applied for posttraining quantization without backpropagation the method takes advantage of approximate hessian information in a certain scalable approximate way based on the assumptions and deductions in the paper squant tries to optimize a constrained absolute sum of error case instead of mse there are good empirical results showing the effectiveness of the method the paper is well written and the method should be of broad interest
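To make the "flip up or down by one step" idea discussed in the reviews above more concrete, here is a toy sketch of error-cancelling rounding: round-to-nearest first, then flip selected elements by one quantization step so the accumulated rounding error per output channel is roughly cancelled. This is only an assumed illustration of the flipping intuition, not the paper's actual progressive squant-e/k/c algorithm, and it assumes the per-channel scales are already chosen.

```python
import numpy as np

def round_with_flips(w, scale):
    """Toy sketch: round-to-nearest, then flip elements by one step so the
    accumulated per-channel rounding error is roughly cancelled.
    w: (out_channels, n) float weights; scale: quantization step size."""
    q = np.round(w / scale)
    err = q - w / scale                  # per-element error, each in [-0.5, 0.5]
    for c in range(w.shape[0]):
        e = err[c]
        total = e.sum()
        # visit elements with the largest same-sign error first: flipping them
        # changes the channel total by +/-1 while barely growing their own error
        for i in np.argsort(-np.abs(e)):
            if abs(total) <= 0.5:
                break
            if np.sign(e[i]) == np.sign(total):
                step = -np.sign(total)   # flip against the accumulated error
                q[c, i] += step
                total += step
    return q * scale
```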
[input_ids: token-id encoding of the example above, omitted for readability]
[attention_mask: all 1s, omitted]
[labels: identical to input_ids (truncated in the source), omitted]
2520, 2929, 29328, 247, 747, 941, 4924, 36643, 1332, 326, 1057, 417, 2430, 896, 44263, 318, 4543, 1442, 292, 25004, 253, 2234, 2934, 310, 25987, 344, 859, 757, 3169, 13757, 326, 476, 320, 45765, 715, 1264, 4243, 3896, 7961, 465, 285, 260, 3969, 281, 253, 1264, 10103, 273, 253, 2801, 13148, 970, 247, 1643, 34754, 824, 347, 2831, 12026, 14275, 281, 25636, 253, 13757, 840, 3185, 273, 278, 339, 253, 4477, 9569, 1083, 20793, 247, 339, 273, 2801, 20452, 253, 5661, 1543, 921, 326, 253, 4081, 20926, 82, 1332, 41731, 13015, 1014, 305, 4989, 82, 326, 310, 10323, 247, 2238, 273, 2805, 255, 253, 4081, 5853, 310, 3340, 4217, 323, 247, 1698, 2713, 36643, 436, 2929, 310, 4518, 3542, 285, 973, 24013, 8550, 846, 247, 5547, 11193, 273, 253, 344, 859, 757, 3169, 2746, 253, 4477, 2319, 849, 281, 15338, 253, 15504, 273, 5743, 3268, 1491, 275, 2593, 495, 7384, 326, 3280, 4735, 8115, 42436, 263, 1661, 366, 342, 1016, 643, 275, 247, 2074, 1039, 253, 5661, 1543, 403, 13943, 275, 2593, 577, 50276, 18, 752, 403, 253, 7364, 273, 436, 789, 253, 3290, 273, 253, 4081, 1332, 651, 3469, 327, 253, 13091, 273, 253, 13260, 281, 11823, 5743, 3268, 604, 247, 6010, 273, 7364, 285, 13260, 273, 3280, 3268, 310, 2530, 352, 651, 320, 9371, 281, 2096, 2139, 436, 789, 310, 760, 3710, 281, 2176, 260, 9866, 3210, 347, 3542, 275, 253, 2929, 1580, 4979, 398, 285, 13361, 793, 403, 9592, 12845, 323, 253, 2170, 326, 556, 644, 14691, 407, 260, 9866, 3210, 352, 651, 320, 3309, 281, 6266, 534, 10895, 390, 1566, 10336, 310, 1682, 2684, 407, 253, 4081, 1332, 50276, 19, 253, 46899, 2746, 310, 3710, 281, 598, 390, 1066, 342, 247, 3213, 1979, 273, 337, 310, 627, 667, 4839, 281, 3157, 253, 7200, 2007, 604, 359, 2572, 253, 3213, 1979, 26332, 6941, 14200, 46899, 7274, 50276, 20, 46899, 2746, 556, 644, 5611, 281, 247, 1643, 2045, 31048, 82, 5609, 1223, 824, 5609, 3798, 5467, 326, 247, 18543, 873, 310, 2530, 849, 2810, 310, 253, 4081, 789, 281, 1110, 2045, 46899, 3169, 31048, 82, 3082, 275, 2426, 273, 1566, 7200, 604, 253, 3064, 310, 16888, 1057, 352, 1599, 326, 15686, 3280, 941, 3268, 310, 26557, 1679, 1774, 323, 260, 9866, 3210, 310, 627, 667, 4839, 824, 247, 3064, 875, 941, 4924, 31048, 82, 285, 18543, 3169, 31048, 82, 476, 320, 4067, 7293, 327, 253, 5319, 273, 253, 10895, 1580, 512, 5661, 941, 275, 436, 2929, 310, 1677, 323, 4440, 257, 292, 760, 436, 2929, 1537, 1246, 271, 13214, 326, 253, 4081, 1332, 1537, 320, 18325, 323, 4440, 257, 292, 3022, 15302, 436, 2929, 23970, 247, 1643, 5272, 13260, 281, 33623, 253, 6031, 281, 6642, 253, 3280, 941, 3268, 5547, 34754, 1421, 281, 5520, 1566, 7200, 1014, 342, 247, 1077, 1355, 1180, 273, 9886, 281, 1957, 13461, 5474, 339, 9852, 436, 2929, 253, 4477, 12661, 247, 941, 4924, 36643, 5933, 273, 3676, 11454, 6928, 1925, 3896, 386, 253, 2022, 2934, 273, 3896, 386, 310, 281, 11101, 3014, 253, 344, 859, 757, 3169, 13757, 8103, 715, 1264, 4295, 3284, 3020, 10295, 3020, 285, 5048, 3020, 4295, 275, 1340, 281, 26277, 22318, 841, 1264, 8103, 3470, 247, 20793, 7880, 2020, 273, 2228, 1083, 310, 5421, 285, 247, 13439, 5933, 310, 908, 3368, 1543, 921, 326, 3896, 386, 5933, 310, 2104, 281, 1978, 2169, 7200, 342, 253, 1072, 1180, 273, 9886, 2429, 281, 2067, 8245, 11333, 4757, 253, 2022, 4757, 273, 436, 2929, 310, 253, 10199, 273, 1554, 2865, 1079, 11193, 273, 344, 859, 757, 4315, 1955, 281, 253, 9171, 1430, 2523, 352, 310, 7479, 281, 4657, 253, 2862, 4315, 5899, 3082, 3798, 760, 1978, 3540, 273, 253, 16421, 2426, 11797, 407, 4315, 14717, 5697, 436, 2929, 5283, 247, 16421, 50276, 6172, 3020, 16421, 50275, 676, 5958, 11193, 534, 
310, 247, 1175, 2934, 3368, 1543, 671, 921, 326, 253, 4315, 14717, 2934, 789, 973, 50276, 20881, 1255, 891, 1158, 627, 403, 690, 32138, 275, 253, 28529, 273, 253, 13757, 8103, 50276, 18, 275, 7212, 374, 45850, 1686, 50276, 20347, 1269, 77, 1269, 5792, 533, 432, 7212, 495, 281, 7212, 577, 299, 13816, 1686, 310, 7932, 407, 385, 77, 1269, 5792, 835, 298, 78, 34654, 2139, 436, 3213, 2186, 2032, 374, 275, 7212, 818, 253, 8103, 310, 2020, 79, 891, 546, 891, 18687, 259, 16192, 300, 19, 50276, 2204, 79, 694, 18687, 259, 78, 13307, 480, 76, 18687, 259, 16192, 5792, 50276, 3591, 18687, 259, 1686, 480, 30664, 259, 1686, 85, 533, 275, 7212, 854, 253, 10303, 546, 891, 694, 285, 7892, 15529, 891, 2096, 326, 512, 253, 10303, 403, 2762, 432, 30762, 247, 19, 533, 347, 1048, 347, 597, 403, 417, 512, 4503, 253, 8103, 943, 320, 1027, 1293, 841, 10303, 891, 13414, 3240, 2096, 849, 359, 755, 854, 432, 818, 50276, 783, 16038, 281, 11101, 3014, 253, 344, 859, 757, 4315, 715, 16421, 2972, 3020, 16421, 285, 1698, 14714, 4295, 310, 253, 9560, 7680, 273, 436, 2929, 3368, 1543, 921, 326, 253, 1554, 2865, 1079, 13757, 8103, 1421, 281, 1175, 3045, 846, 36643, 891, 13414, 956, 690, 5018, 275, 253, 3538, 569, 273, 253, 13757, 8103, 436, 2929, 588, 320, 1805, 604, 253, 4477, 476, 17093, 841, 5018, 973, 2490, 187, 4118, 18435, 27, 783, 4477, 12661, 247, 941, 4924, 36643, 1332, 326, 476, 320, 3732, 1501, 31158, 36643, 1293, 896, 44263, 318, 50276, 783, 1332, 3936, 5750, 273, 16851, 344, 859, 757, 1491, 275, 247, 2176, 44755, 16851, 1039, 1754, 327, 253, 13260, 285, 45131, 275, 253, 2929, 3896, 386, 14177, 281, 22318, 20793, 7880, 2020, 273, 2228, 1083, 3185, 273, 278, 339, 50276, 9088, 403, 1175, 16774, 1543, 4645, 253, 12510, 273, 253, 1332, 285, 253, 2929, 310, 973, 3542, 285, 253, 1332, 943, 320, 273, 3862, 1600 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
This paper presents an automated metric based on mutual information divergence for multimodal generative models. The method exploits a Gaussian mutual information framework and cross-mutual information; it uses the image and text encoders of CLIP to compute the mutual information divergence. The method is theoretically well motivated and presents promising empirical results.

Strengths:
1. The metric is theoretically well motivated and is consistent across datasets and domains.
2. The proposal in the paper correlates well with human judgement across several datasets.
3. Thorough comparison to previous work.

Potential concerns:
- The results currently do not show whether this metric is absolute.
- It is not clear how reliant the empirical results are on CLIP; what if there were no CLIP?
- Missing reference: VIFIDEL (evaluating the visual fidelity of image descriptions), Madhyastha et al., ACL 2019.
- The paper does not present critical limitations of the metric, such as reliance on CLIP, lack of validity across languages, or issues relating to potential dual use of the metric.

docsep

This paper introduced mutual information divergence (MID), a new metric for measuring the quality of image-to-text and text-to-image generation. Similar to model-based metrics like FID, MID relies on a pretrained backbone (CLIP) to extract features, which are used for computing scores. The score is computed by measuring the pointwise MI between the two feature spaces under a Gaussian assumption. Results show that MID can better measure the quality of image-text pairs than existing metrics on augmented datasets.

Pros:
- The metric is simple and straightforward in a good way, combining a powerful multimodal backbone that allows cross-modal relations to be considered with MI that can be computed rather easily. My only concern is the Gaussian assumption (see questions).
- The most important contribution is perhaps the fact that this metric can be applied regardless of the input/output modality, and the choice of backbone can potentially be flexible as well.
- Experiments showed that MID is better than existing metrics and aligns nicely with human evaluation.
- The authors considered many different aspects of designing a new metric, such as consistency, overfitting, backbone, hallucination, and sensitivity, and conducted experiments for each to show that MID is robust.

Cons:
- Only a few models are considered for evaluation. It would be nice to see the results when applying MID to evaluate more models, since metrics are designed to be used to compare models.
- While the advantage over simple metrics like BLEU is clear, the disadvantage is not discussed in the paper (see limitations).

Besides the numerical limitation, there are other limitations not discussed in the paper. For example, as a trade-off for leveraging a pretrained backbone model, this metric might not be applicable to image-text pairs that mismatch the pretraining scenario of the backbone model (e.g., a different language, a different image shape). In other words, this metric is only feasible for high-resource problems.

docsep

This submission proposes a new evaluation metric for vision-language generation tasks. Compared to previous metrics, the authors propose to leverage negative cross-mutual information with multivariate Gaussian distributions to calculate mutual information. The authors conduct experiments on both text-to-image generation and image captioning datasets and show the effectiveness of the proposed metric. The motivation is clear and the proposed framework is easy to understand; besides, the authors also released the source code to reproduce the metric calculation. The technical details are clearly explained, with valid experimental results in support. The authors also conduct an ablation study to discuss the metric's usage on both text-to-image and image captioning tasks on various datasets, which is much appreciated. Please check the questions section above.

docsep

This paper introduces the mutual information divergence (MID) metric for multimodal generative models, where the metric attempts to measure the mutual information between conditions and generations under a Gaussian assumption. Various experiments on text-to-image generation and image captioning, as well as theoretical analysis, are performed to demonstrate the effectiveness and rationality of the proposed metric. It is interesting that human Likert-scale judgment is employed to validate the superiority of MID.

Strengths:
1. This paper introduces the MID metric for multimodal generative models, which is somewhat interesting and brings more reasonable results.
2. The paper conducts solid experiments to validate the effectiveness of MID on both text-to-image generation and image captioning tasks, such as correlations with generated and human Likert-scale judgments, visual reasoning accuracy, Flickr8k-Expert, Flickr8k-CF, Pascal-50S, and FOIL hallucination detection.

Weaknesses:
1. It lacks necessary analysis of the relations between MID and the existing metrics listed in related work.

Please see questions and weaknesses.

docsep

This paper proposes a new metric for evaluating text-to-image and image-to-text models. The metric is mutual information divergence, which uses CLIP features as a source of ground truth. The authors provide theoretical analysis of their metric and demonstrate that it outperforms other metrics on numerous evaluations of real vs. fake images and captions.

Originality. Strengths: this is a novel area of inquiry, and the metric employed achieves better results than any other proposed method. Weaknesses: there have been methods that employed CLIP features previously, and the improvement over these is relatively small.

Quality. Strengths: outperformance of other methods; strong theoretical evaluation of the proposed method. Weaknesses: no discussion of the potential limitations and biases of CLIP.

Clarity. Strengths: visualizations are easy to understand; the intro and mathematical descriptions are well presented. Weaknesses: I found the paper hard to follow at times, especially as it moves into evaluation and discussions.

Significance. Strengths: outperformance of other methods; a wide variety of environments in which the method might be useful for making sense of the relationship between text and images. Weaknesses: many of the best synthetic generators and image captioners now directly employ the CLIP embedding space to achieve strong results (DALL-E models, GLIDE, VQGAN-CLIP, antarctic captions, CLIP prefix-LM and its cousins, etc.). Won't using the CLIP embedding space as a means of evaluating multimodal representations be a confound whenever these systems, which already employ a CLIP model, need to be evaluated?

The paper does not have a limitations section. It could use a discussion of the biases in CLIP; see, for example, Wolfe et al. (2022, ACM FAccT) and Birhane et al. (2021, arXiv).

### Summary:
The paper studies evaluation metrics for multimodal generation models. The authors propose a method, MID, based on estimating the mutual information of visual and text embeddings at the sample and distribution level. In experiments, MID correlates with human evaluation on multiple tasks (text-to-image and image captioning). The authors provide theoretical intuition and analysis of MID and its relation to other divergence scores. The experiments are solid and convincing; the reliance on CLIP is discussed, though multimodal encoders other than CLIP are not evaluated in the experiments. The author discussion with reviewers is helpful for better understanding the paper. Overall, it is a solid paper with a clearly described, simple, and effective method.
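The reviews above describe MID as pointwise mutual information between paired CLIP text and image features under a Gaussian assumption. As a rough illustration only (not the authors' exact estimator; the feature matrices are assumed to be precomputed with CLIP's text and image encoders and stacked row-wise), such a score could be computed along these lines:

```python
# Illustrative sketch, not the official MID implementation: pointwise mutual
# information between paired text/image features under a Gaussian assumption.
import numpy as np
from scipy.stats import multivariate_normal


def gaussian_pmi_scores(text_feats: np.ndarray, image_feats: np.ndarray) -> np.ndarray:
    """Per-pair scores log p(x, y) - log p(x) - log p(y) under fitted Gaussians."""
    joint = np.concatenate([text_feats, image_feats], axis=1)

    def fit(feats: np.ndarray) -> multivariate_normal:
        # A small ridge keeps the covariance estimate well conditioned when
        # the number of samples is close to the feature dimensionality.
        mu = feats.mean(axis=0)
        cov = np.cov(feats, rowvar=False) + 1e-4 * np.eye(feats.shape[1])
        return multivariate_normal(mean=mu, cov=cov, allow_singular=True)

    p_x, p_y, p_xy = fit(text_feats), fit(image_feats), fit(joint)
    return p_xy.logpdf(joint) - p_x.logpdf(text_feats) - p_y.logpdf(image_feats)
```

Averaging the per-pair scores over a benchmark would give the single number that the reviews compare against human judgments.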
[input_ids: token-ID list omitted (tokenized copy of the prompt, reviews, and summary above)]
[attention_mask: list of all 1s omitted (no padding in this row)]
[labels: token-ID list omitted (identical to input_ids)]
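The numeric columns interleaved with the text in this dump (input_ids, attention_mask, labels) are simply the tokenized form of the prompt, reviews, and summary shown above. The exact tokenizer and maximum length are not stated, so the following sketch is only a hypothetical reconstruction of how such rows could be built:

```python
# Hypothetical reconstruction of how rows like the ones above could be
# tokenized. The tokenizer checkpoint and max_length are assumptions; the
# dump does not say which model produced these token IDs.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed

PROMPT = (
    "Below is a review of a research paper from a conference journal. "
    "Please write a summary of the review.\n### Review:\n"
)


def encode_row(review_text: str, summary_text: str) -> dict:
    full = PROMPT + review_text + "\n### Summary:\n" + summary_text
    enc = tokenizer(full, truncation=True, max_length=2048)
    # Causal-LM style: labels mirror input_ids, and attention_mask is all
    # ones because no padding is applied to a single example.
    return {**enc, "labels": list(enc["input_ids"])}
```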
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
This paper proposes two self-supervised pretraining tasks to further pretrain a pretrained language model for commonsense reasoning. The first task is called concept-to-sentence generation, which reconstructs input sentences from noun/verb phrases extracted from the input. The second task is called concept order recovering, which predicts the original sentences after shuffling the order of noun/verb phrases in the input sentences. Experimental results show that pretrained language models fine-tuned with the two proposed tasks can lead to improvement on five commonsense reasoning benchmark datasets.

Strengths:
- The idea of teaching language models through self-supervised learning tasks is neat.
- The performance of the proposed methods with few training examples looks great.
- The results section is well structured: there are ablation studies on the training objectives of each proposed task as well as a comparison of generated sentences.

Concerns:
- The key concern about the paper is the lack of rigorous experimentation to study the effectiveness of the two self-supervised learning tasks. First, the methods are only compared with T5-base-related methods on the four commonsense classification tasks; the leaderboard of CommonsenseQA shows that more than 20 systems report an accuracy higher than 63.32, which is the best configuration of the proposed method. Second, the proposed tasks are applied only to T5; I am wondering whether they are effective on other pretrained language models. Third, the performance improvement on the classification tasks appears marginal; statistical tests are desirable to show whether such improvements are significant.
- Noun and verb phrases extracted from sentences are not always concepts. Masking out certain words in the input is similar to the idea of removing non-content words from the input. A deeper analysis of the proposed method would have been nice to understand which part is effective in the new task: keeping content words, or not using mask-out. What if only concepts from a knowledge base were kept instead of content words?
- The CALM model proposed in this work performs worse than the SOTA models in three out of four metrics on CommonGen, despite using fewer model parameters. What if the proposed tasks were applied to T5-large?

Minor comments:
- I am not convinced that the generated corrupted sentences are in general grammatically correct, as stated in Sec. 2.1.
- I do not see a strong connection between completing COR and compositional reasoning, as stated in Sec. 2.1.
- The way of getting distractor sentences appears ad hoc and may need further justification.
- y in Equations 5 and 6 needs explanation.

docsep

This paper suggests an intermediate training regime that can be used between pretraining and the end-task fine-tuning. The authors suggest that their method captures more commonsense knowledge by being focused on capturing knowledge about concepts. Four different denoising objectives (two generative and two discriminative) are proposed and described in detail, with various possible ways of optimizing for all four. Experimental results show improvements over both the base T5 model and the large T5 model. The proposed method achieves SOTA results on CommonGen with slightly more than half the parameters of the current SOTA model. Ablations show the necessity of applying a two-step intermediate training scheme with mixed training followed by joint training. CALM shows better results with less data than the base model.

Strengths:
- Unifying generative and contrastive training is an important and interesting goal.
- The objectives suggested are cheap to compute and seem to increase the signal available in the data.
- Extensive results show improvements over a base model and a larger model across a range of tasks.
- Performance with relatively little fine-tuning data is encouraging.

Weaknesses:
- Somewhat weaker results on some CommonGen metrics are disappointing.
- Using "concept" to stand in for verbs and nouns is somewhat confusing.

After reading the other reviews and the authors' responses to all of the reviewers, I recommend this paper be accepted: extensive results show that the CALM objectives offer more signal from data than current pretraining methods. This paper suggests a number of cheap-to-compute corruptions of the input data that, when used to reconstruct the input, enrich the underlying model. These objectives certainly improve over the original T5-base and T5-large models that are used as initializations, and especially outperform the base model in the low-data regime. The authors use objectives which capture both generative and discriminative information, which some have suggested contain mutually beneficial signal but which have not been unified in a single training method.

Below are two paragraphs from my original review. The authors have done further experiments and show that there are still gains on these tasks when model size is increased significantly; furthermore, they have clarified that on the key metric of CommonGen they achieved SOTA with only slightly more than half the parameters of the current SOTA model. I therefore believe that these results show merit.

However, in every task except CommonGen the authors do not discuss any methods that are even close to the state of the art: for CSQA the best number in this paper is 63.32 vs. 79.5 on the current leaderboard, and similar numbers hold for the rest of the tasks (60.90 vs. 87 for OBQA, 71.01 vs. 90 for PIQA, and 63.20 vs. 89.70 for ANLI). I do not believe that SOTA results are necessary to write a good paper, and indeed the obsession our field has with SOTA is unhealthy; yet it is difficult for me to trust that the effects in this paper will generalize to better-performing models without further evidence. What if the CALM intermediate objectives only help with mistakes that larger models do not make in the first place? On the generative task, CALM performs closer to SOTA, but it improves only slightly on T5. This is especially disappointing, as the objectives introduced directly match the task in CommonGen, making this intermediate training a form of noisy training data rather than pretraining.

I still feel that the authors' use of "concept" and "commonsense" is vague when their method can be defined more clearly with more mundane terminology. In practice the authors use nouns and verbs as their concepts, which is fine in terms of pretraining objectives but surely does not capture the generality of concepts. The authors have somewhat clarified this in their updated version.

Finally, the CALM intermediate objectives share many properties with all of the datasets tested on and are likely calibrating the model to the kind of correlations it should expect to predict in advance of fine-tuning. One way this can be seen is that the slopes of the T5 and CALM lines are very similar after an initial bump, which T5 likely needs to calibrate to the new distribution. This makes claims of learning commonsense hard to verify, though I do agree that something relevant to solving these problems is clearly being learned.

Altogether, I think this paper makes an interesting contribution to the question of how we can get the most pretraining signal from unstructured data using off-the-shelf tools. I recommend this paper for acceptance, though I encourage the authors to revise their paper to make this the focus of the story rather than the vaguely defined notion of "concept".

docsep

Summary: this paper proposes the concept-aware language model (CALM). A pretrained T5 transformer is trained on self-supervised intermediate tasks to learn relational commonsense knowledge before fine-tuning on downstream tasks. The intermediate tasks include: (1) concept-to-sentence (C2S) generation: given a list of permuted concepts (verbs and nouns), generate the target sequence; (2) concept order recovery (COR): given a sequence with the order of concepts (verbs and nouns) shuffled, generate the original sequence; (3) given a sequence and its perturbed version (concepts shuffled), generate the correct sequence (classification, but framed as a generation task). Training is first done in a multitask fashion, followed by a stage of training that uses generated outputs from (1) and (2) for (3). The goal of the paper is to show that carefully designed objectives for self-supervised intermediate task training add relational commonsense knowledge, which helps improve model performance on downstream commonsense reasoning tasks.

Strengths:
- S1. The idea of self-supervised intermediate task training is an exciting one, especially in the form of adding reasoning capabilities that BERT, GPT, T5, etc. might not be acquiring during the large-scale pretraining phase.
- S2. As shown by the results in Table 1, on 5 commonsense reasoning tasks the intermediate task training proposed in this work improves performance over and above the T5 model and variants, including with salient span masking for concepts.
- S3. The experiments involved averaging across 3 random seeds, and the authors have reported confidence intervals.

Weaknesses and questions:
- W1. One missing baseline is T5 trained to unshuffle entire sequences: given an input with the tokens shuffled, generate the original sequence. This would show how much value C2S and COR are really adding. The current T5 baselines are all trained purely for infilling, which seems a bit unfair compared to CALM, which generates entire sequences.
- W2. Given a T5 model that can score sequences (maybe after training on the autoencoding objective), would it score "apples grow on trees" higher than "trees grow on apples"? If yes, then the model seems to already exhibit this style of reasoning. Would it score "apples grow on trees" and "apples grow in the ground" similarly? The distinction here is between sequences that are non-grammatical or unlikely to ever appear versus sequences that may have appeared (e.g., "potatoes grow in the ground"). Presently, (a) it is unclear whether the designed objectives are providing commonsense reasoning above something the model can know from autoregressive language model scoring, and (b) it appears that the objectives are not designed to add relational commonsense knowledge of the sort where we know apples don't grow in the ground.
- W3. C2S is designed specifically to do well on CommonGen. Are the gains on this task smaller than what you might have expected? If yes, why isn't it helping more?
- W4. Figure 4 needs error bars; the dev sets are really small and it's hard to interpret which differences are significant.

Update: thanks for the incredibly detailed response; I've raised my score to an 8. I do in general quite like the paper, and the responses here are thought-provoking. I'm not sure I'm totally convinced by the WSC results comparing CALM (the classifier) to T5 (the sequence scorer); I'm not sure it's an apples-to-apples comparison, but I'm not sure there's a straightforward setup for this, and perhaps it starts to get beyond the scope of what's being presented here.

docsep

Summary: this paper addresses the issue of incorporating commonsense into large pretrained language models. The authors propose a pretraining method that does not leverage knowledge graphs, but rather uses approaches which involve corrupting the input sentence in various ways to recover the original sentence. This methodology is used to try to bake commonsense into models. Results are shown on both discriminative and generative commonsense datasets.

Novelty and clarity: the training procedure of corrupting inputs to retrieve outputs is not new, but the use on commonsense tasks does seem novel and is also an interesting approach. The paper was very clear to read, and the technical aspects were well described.

Strengths:
1. The use of a self-supervised approach is great because it requires no annotation, and the training procedure is simple; it is also described well.
2. The variety of baselines used is good, and comparison against models larger than the proposed model is interesting to see.
3. The use of generated sentences to improve language models on hard areas like commonsense and offensiveness is a great idea, as it can help make the model more robust.

Weaknesses:
1. There should be a more comprehensive set of results to see how much improvement this model has, mainly on the CommonGen task: there should be some manual evaluation done, similar to the original paper, to see if the outputted sentences make sense and whether the improvement in automatic metrics carries over into human evaluation.
2. A key aspect to look into is the robustness of this model. In the C2S approach the concepts were shuffled to generate the correct sentence; at inference time, if the concepts were shuffled in a different manner, would the model still be able to generate the correct sentences? Three random seeds were used, but, as was said, the performance is sensitive to different random seeds, which suggests that the model isn't as robust to newly seen inputs.

Detailed comments:
- If spaCy was also used for POS tagging along with tokenization, this should be made clear. Also, for every sentence, were there 3 nouns/verbs extracted?
- One thing I'm unclear about: in Table 2, why is "our T5-base" better than the T5-base above? Is this T5 with additional epochs? I think this should be made clear.
- Additionally, I wouldn't say it "is only slightly worse than KG-BART"; it seems a lot worse, especially on BLEU and CIDEr. It is nice to see a smaller model beating a larger model on some metrics.
- "The difference between CALM and CALM-joint is that the former is initialized by the CALM-mix": did you mean to say "latter" instead of "former"? Also, I don't see CALM-joint in the table; I'm assuming this is CALM without mix warmup.

Questions for the authors:
1. How did you ensure that shuffling the sentences still preserves grammatical correctness? A sentence like "running I am" is not grammatically correct.
2. Instead of a POS tagger, why did you not use an NER extractor? Also, wouldn't swapping different fruits into sentences, e.g. replacing "apples grow on trees" with "watermelons grow on trees", help with robustness?
3. At what point is the generative model good enough that it doesn't help to create distractor sentences?

### Summary:
This paper presents two self-supervised learning objectives that can be used as intermediate pretraining tasks to refine the T5 sequence-to-sequence model between pretraining and task fine-tuning. It shows that at small to moderate model sizes, adding this step significantly improves performance on commonsense-oriented target tasks.

Pros: this appears to be a fairly straightforward improvement in self-supervised learning in NLP, with fairly extensive experiments.

Cons: this model isn't trained at the same extremely large scales (10B+ words) as state-of-the-art models, and it performs significantly below the state of the art. It's not clear that the released model represents a useful model for any application as-is, and while it's likely, it's not proven that the ideas in this paper would still be useful at larger scales. Given that, it seems like the most likely audience for this work is other developers of pretrained models in NLP, which makes the fit to a general ML conference less clear. The framing around concepts, and more importantly the model name (concept-aware LM), gives the unwarranted impression that the new model handles concepts in a way that T5 doesn't. It is not reasonable to use the word "concept" to refer to specific parts of speech in your title, even if you later explain that; and whether your model handles concepts in a categorically different way from T5 would take a substantial analysis to show, which doesn't seem to be present. I don't think this paper is up to ICLR's standards with the current name, and I urge the authors to change it.
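The C2S and COR objectives discussed in these reviews are simple corruptions of an input sentence built from its nouns and verbs. The sketch below is a minimal illustration, assuming spaCy part-of-speech tags stand in for the paper's noun/verb extraction (the reviews mention spaCy but leave its exact role unclear); it is not the authors' implementation.

```python
# Minimal sketch of building C2S and COR training pairs from one sentence.
# Assumes the spaCy model "en_core_web_sm" is installed; this is only an
# approximation of the concept extraction described in the reviews.
import random
import spacy

nlp = spacy.load("en_core_web_sm")


def make_examples(sentence: str, seed: int = 0) -> dict:
    rng = random.Random(seed)
    doc = nlp(sentence)
    concepts = [t.text for t in doc if t.pos_ in ("NOUN", "VERB")]

    # Concept-to-sentence (C2S): a shuffled concept list -> original sentence.
    shuffled = concepts[:]
    rng.shuffle(shuffled)
    c2s_input = " ".join(shuffled)

    # Concept order recovery (COR): permute the concept tokens' positions
    # inside the sentence, keep everything else fixed -> original sentence.
    tokens = [t.text for t in doc]
    idxs = [i for i, t in enumerate(doc) if t.pos_ in ("NOUN", "VERB")]
    permuted = idxs[:]
    rng.shuffle(permuted)
    cor_tokens = tokens[:]
    for src, dst in zip(idxs, permuted):
        cor_tokens[dst] = tokens[src]
    cor_input = " ".join(cor_tokens)

    return {"c2s": (c2s_input, sentence), "cor": (cor_input, sentence)}
```

As one review points out, such corrupted inputs (e.g. "running I am") are not guaranteed to be grammatical, which is exactly the concern raised about Sec. 2.1.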
[token-ID list omitted (tokenized copy of part of the review and summary text above; the row's remaining columns continue beyond this excerpt)]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 15988, 327, 841, 8892, 672, 1566, 25180, 310, 2559, 3012, 33810, 597, 452, 31637, 326, 327, 253, 2234, 7982, 273, 764, 543, 257, 597, 6786, 256, 5503, 342, 760, 5777, 625, 685, 2716, 253, 3602, 273, 253, 1655, 256, 5503, 1566, 891, 3103, 2868, 326, 841, 1543, 921, 15785, 50276, 35529, 275, 1046, 4836, 3707, 764, 543, 257, 253, 4477, 513, 417, 2319, 667, 3082, 326, 403, 1014, 2810, 281, 253, 1375, 273, 253, 1445, 323, 260, 18858, 66, 253, 1682, 1180, 275, 436, 2929, 310, 9654, 1237, 4632, 818, 2222, 327, 253, 1655, 6657, 4697, 2074, 3904, 403, 2032, 323, 253, 1551, 273, 253, 8892, 3925, 2270, 4632, 11422, 323, 691, 31569, 818, 6903, 4632, 5091, 323, 12580, 31569, 285, 9654, 938, 4632, 50276, 2511, 1967, 323, 271, 965, 891, 513, 417, 2868, 326, 256, 5503, 1543, 403, 3309, 281, 3630, 247, 1175, 2929, 285, 6296, 253, 37238, 776, 1673, 556, 342, 256, 5503, 310, 36713, 2568, 352, 310, 2834, 323, 479, 281, 4517, 326, 253, 2538, 275, 436, 2929, 588, 39970, 281, 1805, 9591, 3210, 1293, 2007, 1941, 752, 604, 253, 11874, 10444, 16566, 760, 1361, 342, 16503, 326, 4067, 3210, 513, 417, 1056, 275, 253, 806, 1659, 50274, 251, 253, 1006, 800, 4836, 11874, 17923, 8003, 281, 256, 5503, 533, 352, 19132, 760, 5777, 327, 246, 22, 436, 310, 3340, 31623, 347, 253, 16566, 5611, 3587, 3761, 253, 4836, 275, 764, 543, 257, 2403, 436, 10444, 3733, 247, 830, 273, 27620, 3733, 941, 2581, 685, 3215, 26208, 50276, 74, 1335, 1928, 326, 253, 4477, 897, 273, 4473, 285, 764, 49235, 310, 21248, 672, 616, 1332, 476, 320, 2931, 625, 4518, 342, 625, 48657, 28939, 275, 3946, 253, 4477, 897, 28407, 84, 285, 43369, 347, 616, 12342, 534, 310, 4030, 275, 2426, 273, 3215, 26208, 16566, 533, 13353, 1057, 417, 9232, 253, 31376, 273, 12342, 253, 4477, 452, 8489, 31637, 275, 436, 275, 616, 9300, 2715, 50276, 71, 3341, 253, 11874, 10444, 16566, 3894, 1142, 3607, 342, 512, 273, 253, 15302, 5762, 327, 285, 403, 2779, 24403, 839, 253, 1566, 281, 253, 2238, 273, 13007, 597, 943, 1902, 281, 3283, 275, 7170, 273, 1442, 292, 25004, 581, 1039, 436, 476, 320, 2326, 310, 50276, 3529, 253, 28677, 273, 253, 246, 22, 285, 11874, 3104, 403, 1077, 2074, 846, 271, 3302, 19496, 534, 246, 22, 2779, 3198, 281, 24403, 366, 281, 253, 747, 3268, 436, 2789, 3916, 273, 4715, 764, 49235, 1892, 281, 12654, 2167, 891, 513, 5194, 326, 1633, 4623, 281, 16161, 841, 3237, 310, 4518, 1146, 6311, 50276, 2711, 9518, 891, 1158, 436, 2929, 2789, 271, 4722, 7680, 281, 253, 1953, 273, 849, 476, 359, 755, 253, 954, 3215, 26208, 2625, 432, 440, 34218, 941, 970, 273, 649, 1041, 48164, 5657, 891, 5583, 436, 2929, 323, 14924, 2167, 891, 11907, 253, 4477, 281, 49620, 616, 2929, 281, 1056, 436, 253, 2770, 273, 253, 2926, 2581, 685, 253, 39559, 2931, 10732, 273, 4473, 7152, 339, 793, 360, 3454, 436, 2929, 29328, 253, 4473, 6600, 3448, 1566, 11874, 50276, 66, 3215, 11273, 246, 22, 39707, 310, 10166, 327, 1881, 35421, 10444, 8892, 281, 3037, 38524, 764, 49235, 3640, 1078, 1442, 292, 25004, 327, 15450, 8892, 253, 10444, 8892, 2486, 337, 4473, 281, 6197, 260, 19, 84, 5978, 50276, 28821, 247, 1618, 273, 8143, 4525, 12342, 43369, 285, 28407, 84, 6635, 253, 2303, 3425, 374, 4473, 1340, 7355, 944, 50276, 28821, 247, 3425, 342, 253, 1340, 273, 12342, 43369, 285, 28407, 84, 439, 31377, 6635, 253, 3236, 3425, 495, 1677, 247, 3425, 285, 697, 44711, 2715, 12342, 439, 31377, 6635, 253, 3425, 9162, 533, 347, 247, 5978, 4836, 3733, 310, 806, 2218, 275, 247, 1554, 262, 1945, 8142, 3560, 407, 247, 3924, 273, 3733, 326, 4648, 4561, 18012, 432, 337, 285, 374, 323, 495, 50276, 783, 4736, 273, 253, 2929, 
310, 281, 921, 326, 9257, 4158, 16566, 323, 1881, 35421, 10444, 4836, 3733, 823, 38524, 764, 49235, 3640, 534, 7729, 3157, 1566, 3045, 327, 15450, 764, 49235, 14720, 8892, 50275, 296, 3755, 20556, 50276, 84, 18, 50276, 783, 2934, 273, 1881, 35421, 10444, 4836, 3733, 310, 271, 12302, 581, 3340, 275, 253, 830, 273, 6240, 14720, 13789, 326, 270, 797, 305, 431, 246, 22, 3966, 1537, 417, 320, 28635, 1309, 253, 1236, 2510, 25912, 3215, 26208, 3408, 50273, 84, 19, 50275, 284, 2011, 407, 1543, 275, 2829, 337, 327, 608, 764, 49235, 14720, 8892, 253, 10444, 4836, 3733, 4081, 275, 436, 789, 19132, 3045, 689, 285, 1840, 253, 246, 22, 1566, 285, 11640, 1690, 342, 43066, 13905, 44790, 323, 12342, 50274, 84, 20, 50276, 783, 4679, 3206, 25001, 2439, 495, 3632, 12922, 285, 253, 4477, 452, 2361, 7162, 11508, 50273, 20881, 1255, 265, 285, 3533, 50276, 88, 18, 50276, 531, 5816, 8245, 310, 246, 22, 10166, 281, 440, 1200, 23831, 2862, 6430, 50276, 28821, 271, 3280, 342, 253, 21761, 439, 31377, 6635, 253, 3236, 3425, 436, 651, 921, 849, 1199, 1318, 260, 19, 84, 285, 944, 403, 1663, 6240, 253, 1655, 246, 22, 1666, 25379, 403, 512, 10166, 15846, 323, 2192, 3867, 534, 3133, 247, 2372, 16593, 2429, 281, 11874, 534, 310, 11365, 2862, 6430, 50274, 88, 19, 50276, 28821, 247, 246, 22, 1566, 326, 476, 4868, 6430, 5046, 846, 3733, 327, 253, 6753, 27676, 8103, 651, 352, 4868, 28580, 1756, 327, 7139, 2169, 685, 7139, 1756, 327, 28580, 604, 4754, 840, 253, 1566, 3133, 281, 2168, 10738, 436, 3740, 273, 14720, 651, 352, 4868, 28580, 1756, 327, 7139, 285, 28580, 1756, 275, 253, 3216, 12014, 253, 13812, 1060, 310, 875, 6430, 326, 403, 295, 543, 3358, 2056, 474, 390, 11543, 281, 2455, 3176, 7147, 6430, 326, 778, 452, 5420, 24088, 20951, 1756, 275, 253, 3216, 21654, 247, 697, 12744, 604, 253, 4158, 16566, 403, 5277, 764, 49235, 14720, 1840, 1633, 253, 1566, 476, 871, 432, 47694, 11020, 3448, 1566, 14755, 285, 270, 352, 4620, 326, 253, 16566, 403, 417, 4158, 281, 823, 38524, 764, 49235, 3640, 273, 253, 3686, 835, 359, 871, 28580, 13414, 1756, 275, 253, 3216, 50273, 88, 20, 50276, 68, 19, 84, 310, 4158, 5742, 281, 513, 973, 327, 764, 543, 257, 403, 253, 15988, 327, 436, 4836, 4577, 685, 752, 368, 1537, 452, 3264, 604, 4754, 2139, 310, 2649, 352, 9073, 625, 50273, 88, 21, 50276, 13206, 577, 3198, 2228, 8965, 253, 1474, 5239, 403, 1663, 1355, 285, 697, 1892, 281, 4665, 534, 3910, 403, 1534, 50272, 11183, 50276, 35501, 323, 253, 16088, 7000, 2380, 209, 422, 5439, 619, 4868, 281, 247, 854, 50275, 74, 513, 275, 2087, 3240, 751, 253, 2929, 285, 253, 6128, 1060, 403, 1869, 11404, 6856, 516, 417, 2119, 516, 9106, 13762, 407, 253, 259, 1026, 1543, 10941, 11874, 253, 30410, 281, 246, 22, 253, 3425, 4868, 83, 417, 2119, 604, 697, 271, 28580, 281, 28580, 5301, 2858, 516, 417, 2119, 253, 373, 247, 15246, 9978, 323, 436, 285, 4931, 352, 7866, 281, 755, 4457, 253, 7990, 273, 47515, 1146, 3559, 1060, 5474, 33032, 6010, 436, 2929, 12453, 253, 2523, 273, 24049, 764, 49235, 715, 1781, 3215, 11273, 3448, 3210, 597, 12661, 247, 3215, 26208, 1332, 326, 1057, 417, 25057, 3640, 14580, 533, 2581, 897, 7274, 534, 6388, 17715, 272, 253, 3280, 6197, 275, 2710, 4088, 281, 9295, 253, 3236, 6197, 436, 16182, 310, 908, 281, 1611, 281, 32413, 275, 764, 49235, 715, 3210, 1543, 403, 2011, 327, 1097, 20741, 800, 285, 1006, 800, 1846, 3282, 15302, 50275, 2369, 652, 555, 285, 19843, 253, 3733, 5199, 273, 17715, 272, 14800, 281, 19553, 18012, 310, 417, 747, 533, 253, 897, 327, 764, 49235, 8892, 1057, 1646, 4460, 285, 671, 310, 271, 4722, 2746, 253, 2929, 369, 1077, 2590, 
281, 1239, 285, 253, 7681, 7794, 497, 973, 2529, 50275, 296, 3755, 20556, 337, 253, 897, 273, 247, 1881, 35421, 2746, 310, 1270, 984, 352, 4419, 642, 22581, 285, 253, 3733, 5199, 310, 2969, 352, 310, 671, 2529, 973, 374, 253, 5235, 273, 1666, 25379, 908, 310, 1175, 285, 5301, 1411, 3210, 4067, 685, 253, 4081, 1566, 310, 4722, 281, 923, 495, 253, 897, 273, 4561, 14683, 281, 3157, 3448, 3210, 327, 1892, 3672, 751, 764, 49235, 285, 745, 561, 6460, 310, 247, 1270, 2934, 347, 352, 476, 1361, 1056, 253, 1566, 625, 10237, 50274, 20881, 1255, 265, 337, 627, 943, 320, 247, 625, 11088, 873, 273, 1543, 6312, 281, 923, 849, 1199, 7756, 436, 1566, 556, 7194, 327, 253, 764, 543, 257, 966, 627, 943, 320, 690, 11595, 7103, 2218, 2074, 275, 253, 3236, 2929, 281, 923, 604, 253, 3453, 8659, 14683, 1056, 3282, 285, 326, 253, 7756, 275, 12077, 17082, 310, 4824, 689, 715, 1966, 7103, 50276, 19, 247, 2234, 4809, 281, 1007, 715, 310, 253, 31640, 273, 436, 1566, 275, 253, 260, 19, 84, 2746, 253, 12342, 497, 439, 31377, 281, 6635, 253, 3451, 6197, 1309, 17032, 673, 604, 253, 12342, 497, 439, 31377, 275, 247, 1027, 5133, 651, 253, 1566, 1335, 320, 2104, 281, 6635, 253, 3451, 14683, 627, 369, 1264, 3632, 12922, 908, 533, 347, 369, 753, 253, 3045, 310, 7996, 281, 1027, 3632, 12922, 534, 3133, 326, 253, 1566, 310, 2649, 347, 10237, 281, 9841, 2326, 14800, 50275, 5992, 7193, 5701, 604, 653, 1974, 369, 671, 908, 323, 803, 48510, 2112, 342, 10669, 1320, 436, 943, 320, 1160, 2590, 671, 323, 1046, 6197, 369, 627, 495, 28407, 84, 332, 1768, 10375, 50276, 531, 2181, 516, 12744, 670, 310, 275, 2829, 374, 2139, 310, 253, 776, 246, 22, 4793, 1805, 685, 253, 246, 22, 4793, 1840, 310, 436, 246, 22, 342, 3081, 44540, 891, 1158, 436, 943, 320, 1160, 2590, 23000, 891, 651, 2649, 1333, 285, 310, 760, 5777, 7197, 685, 15841, 35292, 352, 3133, 247, 2257, 7197, 3340, 327, 7387, 86, 285, 260, 1334, 352, 310, 5322, 281, 923, 247, 4577, 1566, 310, 18019, 247, 4067, 1566, 327, 690, 17082, 253, 3064, 875, 11874, 285, 11874, 6036, 310, 326, 253, 3438, 310, 31260, 407, 253, 1724, 2188, 895, 858, 368, 1599, 281, 1333, 6158, 3185, 273, 3438, 671, 891, 13414, 923, 11874, 6604, 275, 253, 2829, 516, 7384, 436, 310, 11874, 1293, 5878, 5890, 484, 50274, 34974, 323, 253, 4477, 337, 849, 858, 368, 5416, 439, 47587, 253, 14683, 1335, 556, 47412, 474, 36594, 247, 6197, 751, 3515, 891, 717, 310, 417, 47412, 1037, 3451, 374, 3185, 273, 247, 803, 6809, 1063, 2139, 858, 368, 417, 897, 271, 38998, 4908, 263, 671, 651, 2649, 1863, 5436, 1027, 18098, 715, 14683, 751, 15706, 28580, 1756, 327, 7139, 342, 37385, 293, 790, 1756, 327, 7139, 1361, 342, 31640, 495, 285, 752, 1127, 310, 253, 1006, 800, 1566, 1175, 2217, 326, 352, 36908, 1361, 281, 2794, 940, 30524, 14683, 50275, 187, 187, 4118, 18435, 27, 2520, 2929, 10262, 767, 1881, 35421, 4715, 16566, 326, 476, 320, 908, 347, 10444, 3215, 26208, 8892, 281, 39494, 253, 246, 22, 2160, 2083, 292, 583, 371, 566, 1566, 875, 3215, 26208, 285, 4836, 1442, 292, 25004, 352, 2722, 326, 387, 1355, 281, 10290, 1566, 9552, 6240, 436, 3213, 3012, 19132, 3045, 327, 764, 49235, 21085, 2303, 8892, 50276, 856, 84, 50276, 2520, 4620, 281, 320, 247, 9648, 15246, 7756, 275, 1881, 35421, 4715, 275, 295, 24343, 342, 9648, 9470, 4679, 50276, 5040, 50275, 2520, 1566, 310, 2649, 10166, 387, 253, 1072, 6685, 1781, 11498, 884, 67, 12113, 347, 1375, 23037, 14387, 3210, 285, 352, 17923, 3012, 2708, 253, 1375, 273, 253, 1445, 697, 417, 2590, 326, 253, 4439, 1566, 6125, 247, 4217, 1566, 323, 667, 2898, 347, 261, 285, 1223, 697, 2779, 697, 417, 
11464, 326, 253, 5697, 275, 436, 2929, 651, 1335, 320, 4217, 387, 4067, 11498, 50276, 28821, 326, 352, 3133, 751, 253, 954, 2779, 8446, 323, 436, 789, 310, 643, 12259, 273, 3215, 11273, 3210, 275, 295, 24343, 534, 2789, 253, 4944, 281, 247, 2087, 13361, 8059, 1679, 2590, 50276, 783, 39926, 1475, 12342, 285, 625, 15538, 253, 1566, 1416, 4473, 13823, 298, 78, 4245, 253, 10357, 3298, 9581, 13214, 326, 253, 747, 1566, 22139, 12342, 275, 247, 1039, 326, 246, 22, 36908, 352, 310, 417, 5272, 281, 897, 253, 3159, 4473, 281, 3730, 281, 2173, 4243, 273, 6519, 275, 634, 4060, 1014, 604, 368, 1996, 5513, 326, 285, 1880, 634, 1566, 22139, 12342, 275, 247, 13213, 1037, 1027, 1039, 432, 246, 22, 651, 1379, 247, 6832, 1783, 281, 921, 534, 36908, 1646, 281, 320, 1246, 891, 13414, 1158, 436, 2929, 310, 598, 281, 17857, 77, 2967, 7465, 342, 253, 1655, 1416, 285, 21434, 253, 4477, 281, 1818, 352, 209 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

the paper proposes a new tabular deep learning architecture based on sharing attention matrices between transformers and linear layers. comparisons with other tabular learning models on various benchmarks are demonstrated. i have serious concerns about the paper, as listed below.

the paper writing is very poor: there are numerous language and grammar errors at many places, the expressions are not clear, and the quality of the figures is low.

overall the novelty is low: the transformer architecture is modified in a straightforward way, and even the semisupervised learning approach is a simple adaptation of mlm-like unsupervised pretraining.

benchmarking is employed and presented very poorly. for comparison models like tabnet, xgboost, etc., it is not clear how the hyperparameter tuning is done and what parameters are included in the search space. the authors should clearly describe which parameters were tuned and what validation reward is used; otherwise the outperformance conclusions are not convincing at all.

how do you do hyperparameter tuning in the semisupervised regime? it is unclear how semisupervised validation data is used.

the results are presented in auroc, but what is the training objective? if the training objective is not auroc, how do you ensure the metric mismatch is not dominating?

what is the significance of fig 5? how can you convince the readers that salt has learned attention patterns that are meaningful?

the difference between supervised learning results is very low across different models; it is unclear whether the results are statistically significant. no ablation studies are presented for the major constituents of the claims, such as the benefit of sharing attention layers.

i suggest substantially revising the paper and submitting to another venue.

docsep this paper proposes a hybrid deep network named salt (sharing attention between linear layer and transformer). salt consists of two blocks, transformer and linear layer blocks, that take advantage of shared attention matrices. they compare salt with tree-based ensemble models and previous deep learning models on multiple benchmark datasets. it further shows robustness of the proposed salt with semisupervised learning and pretraining in small dataset scenarios.

the main body of salt has two blocks: a transformers block inspired by vaswani et al 2017 and a linear layer block inspired by liu et al 2021. each block has subblocks, feature-wise and dimension-wise; these two subblocks allow communication between different features and different embedding elements and make the model robust (tolstikhin et al 2021). the output values from the two blocks become the contextual embedding values, and salt performs finetuning and pretraining with these contextual embedding values.

for the supervised learning setting, the mean auroc score of the proposed salt improves upon the tree-based lightgbm by a mere 0.09, and it improves upon the best deep learning based model, saint, by 0.29. the main strength of the paper is marginal improvement over the sota deep learning model saint in the supervised setting. the main weakness is the lack of justification for the proposed choice of salt model and the inadequate explanation of the salt architecture: it is not clear what dimension-wise attention is, there is no motivation/justification for the transformers and linear layers with attention sharing, and further, why is it that there is only one layer of feature-wise attention and linear layer and then l layers of dimension-wise attention and linear layer? neither has been given any good justification for sharing attention between transformer and linear layer. further, for the most important supervised learning scenario, the salt model barely beats the tree-based lightgbm model.

though the numerical experiments show that the proposed model salt marginally improves over the sota deep learning model saint, it barely beats the tree-based lightgbm model. the architectural choices of salt have no motivation/justification.

docsep this paper proposed a hybrid deep network architecture for tabular data dubbed salt (sharing attention between linear layer and transformer). there are two blocks in salt, transformers and linear layer blocks, and sharing attention matrices is introduced to promote cooperation between these two blocks.

pros: this manuscript attempts to propose a deep learning model for tabular data, a problem with practical application value. an interesting hybrid network architecture is presented, the main idea of which is good and reasonable.

cons: the novelty is limited; the key block, sharing attention, shares a similar idea with existing works (realformer). the writing of the manuscript can be polished; typos in equations 7 and 8 are confusing.

the proposed salt, in my view, should be technically correct, but i think the novelty is limited. consequently, although both the problem and the method are okay, this manuscript is a borderline work; i recommend rejecting it.

docsep this paper introduces a transformer-based architecture for tabular datasets. this architecture combines a transformer with a gating mlp (gmlp) by sharing the attention matrices. the authors also proposed a new approach to encode continuous variables. the proposed model is evaluated on six binary classification tabular datasets, in a supervised learning context but also in a semisupervised context.

strengths: i think it is interesting to develop deep models for tabular data; learning on tabular datasets is still an underexplored research area. the idea of sharing attention matrices looks interesting. the authors show that combining a transformer-like architecture with a gmlp-like one improves the performances (i assume the salt results are significantly better than saltformer and saltlinear; see my comment later). the authors analyzed different variants of their model.

weaknesses: in section 3.5, salt improves this method to use the same embedding matrix between categorical variables and continuous variables; i am not sure what the intuition behind this idea is, and the authors should motivate this claim. the model is presented to work on any tabular dataset; however, from experience, a lot of tabular datasets contain at least one datetime column, and i wonder if salt can work on datetime columns. the model is evaluated only for binary classification tasks; i think adding experiments for other tasks (e.g. multiclass classification, multilabel classification, regression) can increase the quality of the paper. the authors should explain how they do the semisupervised learning in section 5.3. for most of the tables, i recommend showing the variance for different random seeds or some other confidence interval measure; for a lot of results the difference between two models looks small and it is difficult to know if the difference is significant. it is difficult to understand table 3 without additional information; i recommend adding information about the dataset or metric used in the caption, and this table is not mentioned in the text. the authors visualize the attention matrices for different variants of their model; however, these visualizations are not commented on, so it is difficult to understand the message from this analysis. the authors should comment on the scalability of their model; the saint model was evaluated on a larger dataset, and i wonder why salt was not evaluated on this dataset. i think the authors should also compare the training time and inference time of their model with existing models. the authors should add information about the transformer architecture used: is it the original transformer architecture (also called post-norm), or is it a more recent architecture like pre-norm or rezero? the last sentence of the conclusion breaks the anonymity of the paper: the code is available at https://github.com/juseong03/salt.

overall, i feel it is quite difficult to understand the potential impact of this paper. the idea of sharing attention matrices looks interesting, but there are some questions about the scalability of the proposed approach and whether it can work on any tabular dataset.

### Summary:
this paper proposes a method to use transformers with tabular data by sharing attention. reviewers raise significant concerns about the motivation, writing, and experimental results. the authors did not submit a response; hence i recommend rejection.
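the reviews above describe salt as a transformer block and a gmlp-style linear block that reuse a single shared attention matrix. the sketch below is a minimal pytorch reconstruction of that sharing idea, written only from the reviews' description: it is not the authors' code, and the module names, layer sizes, and the exact way the linear block consumes the shared attention are illustrative assumptions.

```python
import torch
import torch.nn as nn


class SharedAttention(nn.Module):
    """single-head attention matrix computed once and reused by both blocks."""
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)

    def forward(self, x):  # x: (batch, n_features, dim)
        attn = self.q(x) @ self.k(x).transpose(-2, -1) / x.shape[-1] ** 0.5
        return attn.softmax(dim=-1)  # (batch, n_features, n_features)


class TransformerBlock(nn.Module):
    """value projection + feed-forward, consuming an externally supplied attention matrix."""
    def __init__(self, dim):
        super().__init__()
        self.v = nn.Linear(dim, dim)
        self.ff = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
        self.norm1, self.norm2 = nn.LayerNorm(dim), nn.LayerNorm(dim)

    def forward(self, x, attn):
        x = self.norm1(x + attn @ self.v(x))
        return self.norm2(x + self.ff(x))


class LinearBlock(nn.Module):
    """gmlp-style block: a feature-mixing linear layer gated by the same attention matrix."""
    def __init__(self, dim, n_features):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.mix = nn.Linear(n_features, n_features)  # mixes information across features

    def forward(self, x, attn):
        gate = attn @ self.proj(x)  # reuse of the shared attention matrix
        mixed = self.mix(x.transpose(-2, -1)).transpose(-2, -1)
        return x + gate * mixed


class SharedAttentionLayer(nn.Module):
    """one attention computation feeds both the transformer and the linear block."""
    def __init__(self, dim, n_features):
        super().__init__()
        self.attn = SharedAttention(dim)
        self.transformer = TransformerBlock(dim)
        self.linear = LinearBlock(dim, n_features)

    def forward(self, x):
        a = self.attn(x)
        return self.linear(self.transformer(x, a), a)


x = torch.randn(8, 10, 32)  # 8 rows, 10 tabular features embedded in 32 dims
print(SharedAttentionLayer(32, 10)(x).shape)  # torch.Size([8, 10, 32])
```

one point the reviewers raise is visible in this layout: both blocks depend on a single attention computation, so an ablation of the sharing (giving each block its own attention) is a one-line change, which is why they ask for that comparison.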
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

this paper is a purely theoretical contribution (no code or experiments) on parameter identifiability for a deep relu network. the main contribution is to give criteria for when the behavior of the function parameterized by the network on a finite set of points induces local identifiability of the parameters. whereas global identifiability shows that the parameters that generate that function are unique, local identifiability only tells you that the parameters are unique within an arbitrarily small neighborhood: they are the unique parameters in that neighborhood that produce that behavior of the function. the motivations come from security, privacy, and intellectual property.

to summarize the main steps in the argument: the paper considers a deep fully-connected relu network with parameters theta that outputs a function f_theta(x) on a finite set of inputs x. it is well-known that some operations on the parameters do not change the function, including permutation and rescaling; positive rescaling (multiplying all the output weights of a single neuron by a positive constant and dividing the input weights by the same constant) changes the parameters without changing the function. they decompose the function implemented by the network into a composition of the lifting operator, which only depends on the parameters (not on x), and a piecewise constant operator that depends on both the parameters and the inputs. they exclude a measure zero set they call s from the parameter space; the excluded set s is the set of parameters that lie at the edges of the piecewise constants. they show the lifting operator is invariant to rescaling: if the parameters are equivalent then the lifting operators match, and if the lifting operators match then the parameters are equivalent modulo rescaling. after excluding the set s, the lifting operator maps the parameter space minus s to a smooth manifold sigma_1 with a certain dimension. next they fix some of the weights so that they cannot be rescaled; this removes some of the degrees of freedom for the parameters (i didn't fully understand this step). this allows them to give a geometric condition, that the smooth manifold and an affine space intersect at a single point, which is necessary and sufficient for local identifiability.

however, in practice, if i gave you a network and a finite sample you wouldn't know whether this geometric criterion was satisfied, so they continue to give computable criteria based on the rank of quantities that depend on both x and theta. the computable criteria aren't as tight as the geometric one: the geometric criterion is necessary and sufficient for local identifiability, while there are two different computable criteria, one necessary for identifiability and the other sufficient for local identifiability. they explain how to numerically compute the sufficient criterion, including giving the big-o complexity of the runtimes, leaving efficient computation of the necessary criterion as future work.

the paper is reasonably well-written, although i'm not sure i gained a lot of intuition from reading it. there are no experiments to evaluate. i read the main text of the paper carefully but did not read any of the supplementary material and did not check if the proofs are correct.

my main comment is that i'm unsure of the significance of this paper. i don't fully understand the motivation for studying identifiability of neural network parameters from their function behavior, since neural networks are overparameterized by design and in practice this appears to be important for optimization. further, the assumptions required for this theory are far from practice, and even then only yield a result on local identifiability (identifiability within arbitrarily small neighborhoods of the parameter space). the motivations given in the paper around security/privacy/ip don't seem compelling, because anyone wanting to protect the exact values of parameters could just use a more complex and more overparameterized architecture. however, my background is biased in that i have worked more on the problem of identifying which function a neural network parameterizes given observed data, along the lines of nonlinear ica.

i think the paper is clear about its limitations. i don't have concerns about negative societal impact.

docsep this paper targets an interesting question of neural network basics. the main focus is the uniqueness of parameters in a relu-activated nn; proofs are provided to support the conditions of such uniqueness. the reviewer has the following concerns about the work.

this work starts with a question that doubts the uniqueness of an nn in terms of its parameters, i.e. weights and bias. the authors try to provide the conditions (necessary and sufficient) for this uniqueness for a relu-activated nn. to the best of the reviewer's knowledge, this work is new. for confidentiality/security concerns, the topic is of great importance and needs more attention in the community. while the problem is important, the reviewer has some concerns and questions about the work.

1. the motivation and the relevance to existing work need to be clarified. (a) for example, the authors claim not to be solving the inverse stability problem in sec 1.1; it is confusing what the connection between your work and inverse stability is, and the reviewer didn't find a discussion of it in the rest of the paper. what are the insights generated from this work that can help with inverse stability? (b) in sec 1.2, although the authors provide a literature review of the parameter uniqueness of nns, some details are missing. for example, in row 48 ("interesting results on identifiability of networks"), what are the interesting results? in row 51 ("under some conditions"), what are the conditions that narrow down the implementation? for this entire section, what are the limitations in previous work that make yours necessary?

2. concerns about the general technical content. (a) through the paper, the reviewer would like to know why the paper focuses on relu activation; can the theorems and proofs be generalized to other activation functions? (b) from this reviewer's point of view, it would be better to include a discussion of how the theory can be used for the theoretical justification of relu-based or other neural networks in real applications; they are essential and very helpful in related research and could emphasize the broader impact of this work.

3. details of the mathematical derivation and proofs need to be clarified. (a) the authors mention tangent spaces in the abstract and introduction; why should tangent spaces be used for the proof? (b) could the authors include some explanation of why graph modeling has to be used for the nn representation? is it easier to represent the weights on the edges as in figure 1? (c) it is easy to get lost in heavy math formulas; for example, it is confusing what the purpose of defining several notations in sec 2.1-2.2 is until the illustration in rows 145-147 on page 4. (d) the last sentence of sec 2.2 makes it clear that the purpose of phi is to show the local dependency of f_theta(x), and that this is related to local identifiability; the reviewer suggests that the authors define local identifiability at the beginning of sec 2, as a mathematical definition rather than the informal one in sec 1.1. (e) for sec 3, could the authors explain why the smooth manifold of the image is essential for proving local identifiability? it seems to be a key intermediate result but has not been discussed much. (f) how do you generalize the results from fixed weights to free weights? the case considered now may be too narrow; it is worth more discussion, as the freedom of weight combination and function composition is the magic of deep nns. (g) the definition of affine space needs to be clarified; does ker mean kernel in this paper? (h) the visualization in figure 2 is intuitive and helpful; the reviewer suggests the authors put it around sec 1.3, where the intersection is first mentioned in the contributions, otherwise the description is confusing to follow.

please see the limitations and suggestions in the above comments.

docsep this study investigates the identifiability of deep relu nets, that is, the existence of a tractable inverse map from finite input-response examples (x, f_theta(x)) to the parameter theta, and presents necessary and sufficient conditions for the identifiability.

strength: in general, identification of parameters is a challenging problem. this can be expected by considering fourier series instead of neural networks, which have been investigated much longer than neural networks: it is difficult to estimate the frequencies, the parameters of the original signal, from a finite number of irregularly sampled input-output examples, and even in the case of regular sampling, classical issues such as aliasing are known to occur. therefore i consider the necessary and sufficient conditions to be nontrivial and interesting.

weakness: it would be better if the authors had presented a few simple numerical examples; please refer to the questions section below for more details. i found the notations are sometimes hard to follow; for example, i could not immediately understand that the equation in proposition 1 is an abbreviation of sum_p alpha_p(x, theta) phi_p(v_theta).

yes, they have, in sections 5 and 6.

### Summary:
ratings 655 confidence 343 discussion among reviewers no this paper provides results on local identifiability of deep relu networks identifiability of neural networks is an important theoretical topic with practical implications such as reproducibility and we think the neurips community would find the material interesting although the result is only for fairly specific assumptions that typically dont hold in practice and only on local identifiability the reviewers generally agree that the material is well presented i think the result could serve as a stepping stone towards more general results and i think that the results are worthy of presentation at neurips my recommendation is to accept assuming the list of promised updates to the paper as detailed by the author are executed
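the identifiability discussion above turns on the fact that relu network parameters are at best identifiable up to symmetries such as permutations and positive rescalings of hidden units. a minimal numpy sketch of the rescaling symmetry (layer sizes and values here are illustrative, not taken from the paper) shows two different parameter settings implementing exactly the same function:

```python
import numpy as np

rng = np.random.default_rng(0)

# one-hidden-layer relu network: f(x) = W2 @ relu(W1 @ x + b1) + b2
d_in, d_hidden, d_out = 3, 5, 2
W1 = rng.normal(size=(d_hidden, d_in))
b1 = rng.normal(size=d_hidden)
W2 = rng.normal(size=(d_out, d_hidden))
b2 = rng.normal(size=d_out)

def forward(W1, b1, W2, b2, x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# positive rescaling of hidden unit j: divide its incoming weights and bias by c,
# multiply its outgoing weights by c; relu is positively homogeneous, so f is unchanged
j, c = 2, 3.7
W1r, b1r, W2r = W1.copy(), b1.copy(), W2.copy()
W1r[j] /= c
b1r[j] /= c
W2r[:, j] *= c

x = rng.normal(size=d_in)
print(np.allclose(forward(W1, b1, W2, b2, x), forward(W1r, b1r, W2r, b2, x)))  # True
```

this is why the reviews only speak of identifiability modulo such operations and within local neighborhoods of the parameter space.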
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 310, 247, 15846, 10527, 7680, 642, 2127, 390, 4679, 327, 4764, 1548, 18279, 1430, 323, 247, 3676, 774, 86, 2990, 253, 2022, 7680, 310, 281, 1918, 6866, 323, 672, 253, 3879, 273, 253, 1159, 4764, 1025, 407, 253, 2990, 327, 247, 6486, 873, 273, 2792, 14757, 1980, 1548, 18279, 1430, 273, 253, 3602, 50276, 2811, 284, 4156, 1548, 18279, 1430, 2722, 326, 253, 3602, 326, 6635, 326, 1159, 403, 4451, 1980, 1548, 18279, 1430, 760, 8599, 368, 326, 253, 3602, 403, 4451, 1561, 271, 29607, 1355, 9168, 597, 403, 253, 4451, 3602, 275, 326, 9168, 326, 4711, 326, 3879, 273, 253, 1159, 50276, 783, 42852, 1705, 432, 3988, 11068, 285, 12720, 2867, 50276, 936, 26799, 253, 2022, 5018, 275, 253, 4154, 50275, 783, 2929, 19401, 247, 3676, 4751, 14063, 774, 86, 2990, 342, 3602, 39116, 326, 18012, 247, 1159, 269, 3124, 89, 327, 247, 6486, 873, 273, 14800, 1269, 50276, 262, 310, 973, 4304, 326, 690, 5871, 327, 253, 3602, 513, 417, 1818, 253, 1159, 1690, 29391, 285, 46595, 272, 2762, 46595, 272, 672, 368, 30247, 512, 253, 3453, 13461, 323, 247, 2014, 23586, 407, 247, 2762, 3638, 285, 10957, 253, 3280, 13461, 407, 253, 1072, 3638, 2544, 253, 3602, 1293, 6890, 253, 1159, 50276, 9328, 11101, 3014, 253, 1159, 9009, 407, 253, 2990, 715, 247, 5889, 273, 253, 20284, 5572, 534, 760, 7024, 327, 253, 3602, 417, 327, 1269, 285, 247, 5313, 3020, 3638, 5572, 326, 7024, 327, 1097, 253, 3602, 285, 253, 14800, 50276, 9328, 16670, 247, 2557, 5058, 873, 597, 1067, 256, 432, 253, 4764, 2317, 253, 10432, 873, 256, 310, 253, 6486, 873, 273, 3602, 326, 7027, 387, 253, 9297, 273, 253, 5313, 3020, 14637, 50276, 9328, 921, 253, 20284, 5572, 310, 13727, 281, 46595, 272, 604, 253, 3602, 403, 6425, 840, 253, 20284, 9158, 3761, 604, 253, 20284, 9158, 3761, 840, 253, 3602, 403, 6425, 40090, 46595, 272, 50276, 6438, 22914, 253, 873, 256, 253, 20284, 5572, 8115, 253, 4764, 2317, 19734, 256, 281, 247, 6032, 16751, 40009, 18, 342, 247, 2176, 7877, 50276, 8384, 597, 4993, 690, 273, 253, 13461, 594, 326, 597, 16216, 320, 46595, 264, 436, 26586, 690, 273, 253, 7759, 273, 7185, 323, 253, 3602, 891, 42126, 4751, 2096, 436, 3213, 50276, 2520, 4483, 731, 281, 1918, 247, 17856, 1617, 326, 253, 6032, 16751, 285, 271, 29438, 2317, 23965, 387, 247, 2014, 1127, 326, 310, 6425, 3309, 285, 4209, 281, 1980, 1548, 18279, 1430, 50276, 35529, 275, 3946, 604, 891, 3534, 368, 247, 2990, 285, 247, 6486, 3410, 368, 651, 2649, 871, 1880, 436, 17856, 6866, 369, 10048, 594, 597, 4035, 281, 1918, 2475, 494, 6866, 1754, 327, 253, 5958, 273, 13483, 326, 3469, 327, 1097, 1269, 285, 39116, 50276, 783, 2475, 494, 6866, 403, 2649, 347, 6863, 347, 253, 17856, 6866, 253, 17856, 17705, 310, 3309, 285, 4209, 323, 1980, 1548, 18279, 1430, 627, 403, 767, 1027, 2475, 494, 6866, 50276, 531, 310, 3309, 323, 1548, 18279, 1430, 285, 253, 643, 310, 4209, 323, 1980, 1548, 18279, 1430, 50276, 9328, 5513, 849, 281, 27184, 11897, 253, 4209, 17705, 1690, 4933, 253, 1943, 80, 10454, 273, 253, 1408, 3181, 597, 6108, 5919, 13782, 273, 3309, 17705, 347, 2852, 789, 253, 2929, 310, 12054, 973, 15720, 3738, 516, 417, 2119, 891, 12103, 247, 2257, 273, 30328, 432, 4361, 352, 627, 403, 642, 4679, 281, 7472, 891, 1239, 253, 2022, 2505, 273, 253, 2929, 9257, 533, 858, 417, 1239, 667, 273, 253, 24864, 2144, 285, 858, 417, 2451, 604, 253, 27947, 403, 3451, 50276, 2577, 2022, 4385, 310, 326, 516, 31488, 273, 253, 8453, 273, 436, 2929, 891, 13414, 4751, 2096, 
253, 16038, 323, 12392, 1548, 18279, 1430, 273, 11454, 2990, 3602, 432, 616, 1159, 3879, 1580, 11454, 6928, 403, 689, 19484, 1025, 407, 2216, 285, 275, 3946, 436, 4620, 281, 320, 1774, 323, 13757, 2007, 253, 13260, 2424, 323, 436, 3762, 403, 594, 2080, 432, 3946, 285, 1014, 840, 760, 4917, 247, 906, 327, 1980, 1548, 18279, 1430, 1548, 18279, 1430, 1561, 29607, 1355, 25237, 273, 253, 4764, 2317, 253, 42852, 1677, 275, 253, 2929, 1475, 3988, 13552, 1974, 532, 13414, 1646, 18511, 984, 3780, 14707, 281, 4017, 253, 3242, 2193, 273, 3602, 812, 816, 897, 247, 625, 2570, 285, 625, 689, 19484, 1025, 10336, 2299, 619, 4114, 310, 23539, 275, 326, 891, 452, 4307, 625, 327, 253, 1895, 273, 12488, 534, 1159, 247, 11454, 2990, 4764, 4219, 1677, 2540, 941, 2112, 253, 3104, 273, 14561, 209, 3737, 891, 1158, 253, 2929, 310, 2590, 670, 697, 7364, 891, 13414, 452, 7350, 670, 4016, 38058, 3486, 5474, 33032, 2520, 2929, 8571, 271, 4722, 1953, 273, 11454, 2990, 3720, 253, 2022, 2770, 310, 253, 34002, 273, 3602, 275, 247, 774, 86, 18132, 48257, 27947, 403, 2530, 281, 1329, 253, 2515, 273, 824, 34002, 253, 37317, 556, 253, 1563, 7350, 670, 253, 789, 436, 789, 7866, 342, 247, 1953, 281, 5545, 253, 34002, 273, 48257, 275, 2426, 273, 253, 3602, 26332, 13461, 285, 8492, 253, 4477, 1611, 281, 2085, 253, 2515, 3309, 285, 4209, 273, 253, 34002, 323, 247, 774, 86, 18132, 48257, 281, 253, 1682, 273, 253, 10123, 3640, 436, 789, 310, 747, 323, 18987, 13980, 7350, 253, 9400, 310, 273, 1270, 6349, 285, 3198, 625, 4116, 275, 253, 3114, 1223, 253, 1895, 310, 1774, 253, 2278, 556, 690, 7350, 285, 3533, 670, 253, 789, 50276, 18, 186, 783, 16038, 285, 253, 17200, 281, 5368, 789, 878, 281, 320, 31637, 50275, 66, 186, 1542, 1650, 253, 4477, 1750, 417, 16161, 253, 13737, 7882, 1895, 275, 4706, 1903, 352, 310, 21643, 752, 310, 253, 4602, 875, 634, 789, 285, 13737, 7882, 253, 2278, 42126, 1089, 247, 5955, 275, 253, 1563, 2929, 752, 403, 253, 16039, 4561, 432, 436, 789, 326, 476, 1361, 342, 13737, 7882, 50276, 67, 186, 249, 4706, 1249, 3738, 253, 4477, 2085, 247, 6239, 2278, 273, 253, 4764, 34002, 273, 295, 2224, 690, 4278, 403, 5816, 323, 1650, 275, 4194, 5693, 4722, 1543, 327, 1548, 18279, 1430, 273, 6928, 752, 403, 253, 4722, 1543, 275, 4194, 8319, 762, 690, 2515, 752, 403, 253, 2515, 326, 6891, 1066, 253, 7092, 323, 436, 2862, 2593, 752, 310, 253, 7364, 275, 2045, 789, 594, 326, 13298, 310, 3309, 50275, 19, 186, 585, 1209, 2224, 670, 253, 2087, 7681, 9410, 50276, 66, 186, 10489, 253, 2929, 4361, 253, 2278, 651, 751, 281, 871, 2139, 1057, 253, 2929, 2770, 327, 774, 86, 5743, 476, 253, 39383, 285, 27947, 320, 14923, 281, 643, 5743, 3470, 50276, 67, 186, 4064, 436, 10123, 1127, 273, 1859, 697, 1805, 281, 2486, 253, 5955, 273, 849, 253, 3762, 476, 320, 908, 323, 253, 10527, 22861, 273, 774, 538, 833, 390, 643, 11454, 6928, 275, 7863, 4893, 597, 403, 5667, 285, 1077, 9371, 275, 2905, 2561, 285, 812, 22175, 253, 16055, 3486, 273, 436, 789, 50275, 20, 186, 23454, 273, 253, 15965, 28529, 285, 27947, 878, 281, 320, 31637, 50275, 66, 186, 783, 4477, 3748, 28196, 8470, 275, 253, 12002, 285, 10199, 323, 4737, 2139, 943, 28196, 2317, 320, 908, 50275, 67, 186, 16534, 253, 2488, 2486, 690, 22909, 273, 2139, 4216, 14053, 556, 281, 320, 908, 323, 48257, 6779, 310, 352, 6927, 281, 1957, 253, 13461, 327, 253, 9297, 751, 275, 4677, 337, 50275, 68, 186, 953, 3477, 281, 755, 3663, 275, 5536, 14168, 23276, 323, 1650, 352, 310, 21643, 752, 310, 253, 4096, 273, 13947, 2067, 41818, 275, 4706, 19, 14351, 1919, 253, 23356, 275, 4194, 19092, 14555, 327, 3239, 
577, 50276, 69, 186, 7461, 253, 1390, 6197, 273, 4706, 1423, 2789, 352, 2590, 326, 253, 4096, 273, 815, 74, 310, 281, 921, 253, 1980, 18925, 273, 269, 3124, 89, 285, 436, 310, 2905, 281, 253, 1980, 1548, 18279, 1430, 253, 37317, 5936, 326, 253, 4477, 943, 4853, 1980, 1548, 18279, 1430, 387, 253, 5068, 273, 4706, 374, 534, 943, 320, 247, 15965, 5426, 643, 685, 253, 581, 275, 4706, 1903, 50276, 70, 186, 1542, 4706, 495, 812, 253, 4477, 5513, 2139, 253, 6032, 16751, 273, 2460, 310, 5667, 323, 18597, 1980, 1548, 18279, 1430, 352, 3133, 281, 320, 247, 2234, 10444, 906, 533, 556, 417, 644, 5393, 1199, 50276, 71, 186, 5430, 513, 368, 39970, 253, 1543, 432, 18505, 13461, 281, 1959, 13461, 253, 1083, 1024, 778, 320, 1512, 6891, 352, 310, 4409, 625, 5955, 347, 253, 7185, 273, 13461, 5019, 285, 1159, 5889, 403, 253, 10721, 273, 3676, 48257, 50275, 72, 186, 783, 5426, 273, 29438, 2317, 3198, 281, 320, 31637, 1057, 20017, 1599, 10295, 275, 436, 2929, 50276, 73, 186, 783, 24426, 275, 4677, 374, 310, 27350, 285, 9371, 891, 5125, 253, 4477, 1691, 352, 1475, 4706, 2145, 835, 253, 15171, 310, 806, 5393, 275, 7680, 5010, 253, 5740, 310, 21643, 281, 956, 50276, 32897, 923, 253, 7364, 285, 13991, 275, 253, 1840, 5701, 5474, 33032, 2520, 1263, 2340, 684, 253, 1548, 18279, 1430, 273, 3676, 774, 86, 37507, 326, 310, 253, 6242, 273, 247, 10649, 494, 13737, 3711, 432, 6486, 3280, 10927, 6667, 1269, 649, 248, 9292, 281, 253, 4764, 39116, 285, 10262, 3309, 285, 4209, 2515, 273, 253, 1548, 18279, 1430, 4757, 275, 2087, 8137, 273, 3602, 310, 247, 11132, 1895, 436, 476, 320, 3264, 407, 7296, 269, 15421, 2962, 3185, 273, 11454, 6928, 534, 452, 644, 6949, 1199, 3356, 685, 11454, 6928, 326, 310, 352, 310, 2834, 281, 6642, 253, 11383, 390, 253, 3602, 273, 253, 3236, 2625, 432, 247, 6486, 1180, 273, 17948, 314, 19958, 3280, 9252, 6667, 1014, 275, 253, 1083, 273, 3963, 10491, 8946, 3374, 824, 347, 19541, 2355, 403, 1929, 281, 2826, 3103, 891, 1908, 253, 3309, 285, 4209, 2515, 281, 320, 37825, 285, 4722, 50276, 20881, 1255, 352, 651, 320, 1805, 604, 253, 4477, 452, 3559, 247, 1643, 2969, 10704, 6667, 4496, 3730, 281, 253, 3533, 2593, 2708, 323, 625, 4278, 50276, 74, 1119, 253, 41818, 403, 4536, 1892, 281, 956, 323, 1650, 891, 812, 417, 4745, 2096, 326, 253, 5150, 275, 13989, 337, 310, 271, 31931, 2492, 273, 256, 1765, 355, 545, 522, 633, 22666, 815, 532, 87, 3124, 50276, 9820, 597, 452, 275, 7118, 608, 285, 721, 50276, 187, 187, 4118, 18435, 27, 9296, 723, 39005, 7162, 35815, 5955, 2190, 30628, 642, 50276, 2520, 2929, 3400, 1543, 327, 1980, 1548, 18279, 1430, 273, 3676, 774, 86, 6928, 1548, 18279, 1430, 273, 11454, 6928, 310, 271, 1774, 10527, 9400, 342, 8542, 12739, 824, 347, 7598, 68, 1430, 285, 359, 1158, 253, 5723, 2824, 3114, 651, 1089, 253, 2144, 4722, 3738, 253, 906, 310, 760, 323, 9648, 2173, 13260, 326, 5431, 13414, 2186, 275, 3946, 285, 760, 327, 1980, 1548, 18279, 1430, 253, 30628, 3839, 5194, 326, 253, 2144, 310, 973, 3559, 891, 1158, 253, 906, 812, 1421, 347, 247, 24655, 8805, 4404, 625, 2087, 1543, 285, 891, 1158, 326, 253, 1543, 403, 18338, 273, 9759, 387, 5723, 2824, 50276, 2577, 17401, 310, 281, 2997, 7384, 253, 1618, 273, 12316, 11269, 281, 253, 2929, 347, 7000, 407, 253, 2488, 403, 11407 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 310, 247, 15846, 10527, 7680, 642, 2127, 390, 4679, 327, 4764, 1548, 18279, 1430, 323, 247, 3676, 774, 86, 2990, 253, 2022, 7680, 310, 281, 1918, 6866, 323, 672, 253, 3879, 273, 253, 1159, 4764, 1025, 407, 253, 2990, 327, 247, 6486, 873, 273, 2792, 14757, 1980, 1548, 18279, 1430, 273, 253, 3602, 50276, 2811, 284, 4156, 1548, 18279, 1430, 2722, 326, 253, 3602, 326, 6635, 326, 1159, 403, 4451, 1980, 1548, 18279, 1430, 760, 8599, 368, 326, 253, 3602, 403, 4451, 1561, 271, 29607, 1355, 9168, 597, 403, 253, 4451, 3602, 275, 326, 9168, 326, 4711, 326, 3879, 273, 253, 1159, 50276, 783, 42852, 1705, 432, 3988, 11068, 285, 12720, 2867, 50276, 936, 26799, 253, 2022, 5018, 275, 253, 4154, 50275, 783, 2929, 19401, 247, 3676, 4751, 14063, 774, 86, 2990, 342, 3602, 39116, 326, 18012, 247, 1159, 269, 3124, 89, 327, 247, 6486, 873, 273, 14800, 1269, 50276, 262, 310, 973, 4304, 326, 690, 5871, 327, 253, 3602, 513, 417, 1818, 253, 1159, 1690, 29391, 285, 46595, 272, 2762, 46595, 272, 672, 368, 30247, 512, 253, 3453, 13461, 323, 247, 2014, 23586, 407, 247, 2762, 3638, 285, 10957, 253, 3280, 13461, 407, 253, 1072, 3638, 2544, 253, 3602, 1293, 6890, 253, 1159, 50276, 9328, 11101, 3014, 253, 1159, 9009, 407, 253, 2990, 715, 247, 5889, 273, 253, 20284, 5572, 534, 760, 7024, 327, 253, 3602, 417, 327, 1269, 285, 247, 5313, 3020, 3638, 5572, 326, 7024, 327, 1097, 253, 3602, 285, 253, 14800, 50276, 9328, 16670, 247, 2557, 5058, 873, 597, 1067, 256, 432, 253, 4764, 2317, 253, 10432, 873, 256, 310, 253, 6486, 873, 273, 3602, 326, 7027, 387, 253, 9297, 273, 253, 5313, 3020, 14637, 50276, 9328, 921, 253, 20284, 5572, 310, 13727, 281, 46595, 272, 604, 253, 3602, 403, 6425, 840, 253, 20284, 9158, 3761, 604, 253, 20284, 9158, 3761, 840, 253, 3602, 403, 6425, 40090, 46595, 272, 50276, 6438, 22914, 253, 873, 256, 253, 20284, 5572, 8115, 253, 4764, 2317, 19734, 256, 281, 247, 6032, 16751, 40009, 18, 342, 247, 2176, 7877, 50276, 8384, 597, 4993, 690, 273, 253, 13461, 594, 326, 597, 16216, 320, 46595, 264, 436, 26586, 690, 273, 253, 7759, 273, 7185, 323, 253, 3602, 891, 42126, 4751, 2096, 436, 3213, 50276, 2520, 4483, 731, 281, 1918, 247, 17856, 1617, 326, 253, 6032, 16751, 285, 271, 29438, 2317, 23965, 387, 247, 2014, 1127, 326, 310, 6425, 3309, 285, 4209, 281, 1980, 1548, 18279, 1430, 50276, 35529, 275, 3946, 604, 891, 3534, 368, 247, 2990, 285, 247, 6486, 3410, 368, 651, 2649, 871, 1880, 436, 17856, 6866, 369, 10048, 594, 597, 4035, 281, 1918, 2475, 494, 6866, 1754, 327, 253, 5958, 273, 13483, 326, 3469, 327, 1097, 1269, 285, 39116, 50276, 783, 2475, 494, 6866, 403, 2649, 347, 6863, 347, 253, 17856, 6866, 253, 17856, 17705, 310, 3309, 285, 4209, 323, 1980, 1548, 18279, 1430, 627, 403, 767, 1027, 2475, 494, 6866, 50276, 531, 310, 3309, 323, 1548, 18279, 1430, 285, 253, 643, 310, 4209, 323, 1980, 1548, 18279, 1430, 50276, 9328, 5513, 849, 281, 27184, 11897, 253, 4209, 17705, 1690, 4933, 253, 1943, 80, 10454, 273, 253, 1408, 3181, 597, 6108, 5919, 13782, 273, 3309, 17705, 347, 2852, 789, 253, 2929, 310, 12054, 973, 15720, 3738, 516, 417, 2119, 891, 12103, 247, 2257, 273, 30328, 432, 4361, 352, 627, 403, 642, 4679, 281, 7472, 891, 1239, 253, 2022, 2505, 273, 253, 2929, 9257, 533, 858, 417, 1239, 667, 273, 253, 24864, 2144, 285, 858, 417, 2451, 604, 253, 27947, 403, 3451, 50276, 2577, 2022, 4385, 310, 326, 516, 31488, 273, 253, 8453, 273, 436, 2929, 891, 13414, 4751, 2096, 
253, 16038, 323, 12392, 1548, 18279, 1430, 273, 11454, 2990, 3602, 432, 616, 1159, 3879, 1580, 11454, 6928, 403, 689, 19484, 1025, 407, 2216, 285, 275, 3946, 436, 4620, 281, 320, 1774, 323, 13757, 2007, 253, 13260, 2424, 323, 436, 3762, 403, 594, 2080, 432, 3946, 285, 1014, 840, 760, 4917, 247, 906, 327, 1980, 1548, 18279, 1430, 1548, 18279, 1430, 1561, 29607, 1355, 25237, 273, 253, 4764, 2317, 253, 42852, 1677, 275, 253, 2929, 1475, 3988, 13552, 1974, 532, 13414, 1646, 18511, 984, 3780, 14707, 281, 4017, 253, 3242, 2193, 273, 3602, 812, 816, 897, 247, 625, 2570, 285, 625, 689, 19484, 1025, 10336, 2299, 619, 4114, 310, 23539, 275, 326, 891, 452, 4307, 625, 327, 253, 1895, 273, 12488, 534, 1159, 247, 11454, 2990, 4764, 4219, 1677, 2540, 941, 2112, 253, 3104, 273, 14561, 209, 3737, 891, 1158, 253, 2929, 310, 2590, 670, 697, 7364, 891, 13414, 452, 7350, 670, 4016, 38058, 3486, 5474, 33032, 2520, 2929, 8571, 271, 4722, 1953, 273, 11454, 2990, 3720, 253, 2022, 2770, 310, 253, 34002, 273, 3602, 275, 247, 774, 86, 18132, 48257, 27947, 403, 2530, 281, 1329, 253, 2515, 273, 824, 34002, 253, 37317, 556, 253, 1563, 7350, 670, 253, 789, 436, 789, 7866, 342, 247, 1953, 281, 5545, 253, 34002, 273, 48257, 275, 2426, 273, 253, 3602, 26332, 13461, 285, 8492, 253, 4477, 1611, 281, 2085, 253, 2515, 3309, 285, 4209, 273, 253, 34002, 323, 247, 774, 86, 18132, 48257, 281, 253, 1682, 273, 253, 10123, 3640, 436, 789, 310, 747, 323, 18987, 13980, 7350, 253, 9400, 310, 273, 1270, 6349, 285, 3198, 625, 4116, 275, 253, 3114, 1223, 253, 1895, 310, 1774, 253, 2278, 556, 690, 7350, 285, 3533, 670, 253, 789, 50276, 18, 186, 783, 16038, 285, 253, 17200, 281, 5368, 789, 878, 281, 320, 31637, 50275, 66, 186, 1542, 1650, 253, 4477, 1750, 417, 16161, 253, 13737, 7882, 1895, 275, 4706, 1903, 352, 310, 21643, 752, 310, 253, 4602, 875, 634, 789, 285, 13737, 7882, 253, 2278, 42126, 1089, 247, 5955, 275, 253, 1563, 2929, 752, 403, 253, 16039, 4561, 432, 436, 789, 326, 476, 1361, 342, 13737, 7882, 50276, 67, 186, 249, 4706, 1249, 3738, 253, 4477, 2085, 247, 6239, 2278, 273, 253, 4764, 34002, 273, 295, 2224, 690, 4278, 403, 5816, 323, 1650, 275, 4194, 5693, 4722, 1543, 327, 1548, 18279, 1430, 273, 6928, 752, 403, 253, 4722, 1543, 275, 4194, 8319, 762, 690, 2515, 752, 403, 253, 2515, 326, 6891, 1066, 253, 7092, 323, 436, 2862, 2593, 752, 310, 253, 7364, 275, 2045, 789, 594, 326, 13298, 310, 3309, 50275, 19, 186, 585, 1209, 2224, 670, 253, 2087, 7681, 9410, 50276, 66, 186, 10489, 253, 2929, 4361, 253, 2278, 651, 751, 281, 871, 2139, 1057, 253, 2929, 2770, 327, 774, 86, 5743, 476, 253, 39383, 285, 27947, 320, 14923, 281, 643, 5743, 3470, 50276, 67, 186, 4064, 436, 10123, 1127, 273, 1859, 697, 1805, 281, 2486, 253, 5955, 273, 849, 253, 3762, 476, 320, 908, 323, 253, 10527, 22861, 273, 774, 538, 833, 390, 643, 11454, 6928, 275, 7863, 4893, 597, 403, 5667, 285, 1077, 9371, 275, 2905, 2561, 285, 812, 22175, 253, 16055, 3486, 273, 436, 789, 50275, 20, 186, 23454, 273, 253, 15965, 28529, 285, 27947, 878, 281, 320, 31637, 50275, 66, 186, 783, 4477, 3748, 28196, 8470, 275, 253, 12002, 285, 10199, 323, 4737, 2139, 943, 28196, 2317, 320, 908, 50275, 67, 186, 16534, 253, 2488, 2486, 690, 22909, 273, 2139, 4216, 14053, 556, 281, 320, 908, 323, 48257, 6779, 310, 352, 6927, 281, 1957, 253, 13461, 327, 253, 9297, 751, 275, 4677, 337, 50275, 68, 186, 953, 3477, 281, 755, 3663, 275, 5536, 14168, 23276, 323, 1650, 352, 310, 21643, 752, 310, 253, 4096, 273, 13947, 2067, 41818, 275, 4706, 19, 14351, 1919, 253, 23356, 275, 4194, 19092, 14555, 327, 3239, 
577, 50276, 69, 186, 7461, 253, 1390, 6197, 273, 4706, 1423, 2789, 352, 2590, 326, 253, 4096, 273, 815, 74, 310, 281, 921, 253, 1980, 18925, 273, 269, 3124, 89, 285, 436, 310, 2905, 281, 253, 1980, 1548, 18279, 1430, 253, 37317, 5936, 326, 253, 4477, 943, 4853, 1980, 1548, 18279, 1430, 387, 253, 5068, 273, 4706, 374, 534, 943, 320, 247, 15965, 5426, 643, 685, 253, 581, 275, 4706, 1903, 50276, 70, 186, 1542, 4706, 495, 812, 253, 4477, 5513, 2139, 253, 6032, 16751, 273, 2460, 310, 5667, 323, 18597, 1980, 1548, 18279, 1430, 352, 3133, 281, 320, 247, 2234, 10444, 906, 533, 556, 417, 644, 5393, 1199, 50276, 71, 186, 5430, 513, 368, 39970, 253, 1543, 432, 18505, 13461, 281, 1959, 13461, 253, 1083, 1024, 778, 320, 1512, 6891, 352, 310, 4409, 625, 5955, 347, 253, 7185, 273, 13461, 5019, 285, 1159, 5889, 403, 253, 10721, 273, 3676, 48257, 50275, 72, 186, 783, 5426, 273, 29438, 2317, 3198, 281, 320, 31637, 1057, 20017, 1599, 10295, 275, 436, 2929, 50276, 73, 186, 783, 24426, 275, 4677, 374, 310, 27350, 285, 9371, 891, 5125, 253, 4477, 1691, 352, 1475, 4706, 2145, 835, 253, 15171, 310, 806, 5393, 275, 7680, 5010, 253, 5740, 310, 21643, 281, 956, 50276, 32897, 923, 253, 7364, 285, 13991, 275, 253, 1840, 5701, 5474, 33032, 2520, 1263, 2340, 684, 253, 1548, 18279, 1430, 273, 3676, 774, 86, 37507, 326, 310, 253, 6242, 273, 247, 10649, 494, 13737, 3711, 432, 6486, 3280, 10927, 6667, 1269, 649, 248, 9292, 281, 253, 4764, 39116, 285, 10262, 3309, 285, 4209, 2515, 273, 253, 1548, 18279, 1430, 4757, 275, 2087, 8137, 273, 3602, 310, 247, 11132, 1895, 436, 476, 320, 3264, 407, 7296, 269, 15421, 2962, 3185, 273, 11454, 6928, 534, 452, 644, 6949, 1199, 3356, 685, 11454, 6928, 326, 310, 352, 310, 2834, 281, 6642, 253, 11383, 390, 253, 3602, 273, 253, 3236, 2625, 432, 247, 6486, 1180, 273, 17948, 314, 19958, 3280, 9252, 6667, 1014, 275, 253, 1083, 273, 3963, 10491, 8946, 3374, 824, 347, 19541, 2355, 403, 1929, 281, 2826, 3103, 891, 1908, 253, 3309, 285, 4209, 2515, 281, 320, 37825, 285, 4722, 50276, 20881, 1255, 352, 651, 320, 1805, 604, 253, 4477, 452, 3559, 247, 1643, 2969, 10704, 6667, 4496, 3730, 281, 253, 3533, 2593, 2708, 323, 625, 4278, 50276, 74, 1119, 253, 41818, 403, 4536, 1892, 281, 956, 323, 1650, 891, 812, 417, 4745, 2096, 326, 253, 5150, 275, 13989, 337, 310, 271, 31931, 2492, 273, 256, 1765, 355, 545, 522, 633, 22666, 815, 532, 87, 3124, 50276, 9820, 597, 452, 275, 7118, 608, 285, 721, 50276, 187, 187, 4118, 18435, 27, 9296, 723, 39005, 7162, 35815, 5955, 2190, 30628, 642, 50276, 2520, 2929, 3400, 1543, 327, 1980, 1548, 18279, 1430, 273, 3676, 774, 86, 6928, 1548, 18279, 1430, 273, 11454, 6928, 310, 271, 1774, 10527, 9400, 342, 8542, 12739, 824, 347, 7598, 68, 1430, 285, 359, 1158, 253, 5723, 2824, 3114, 651, 1089, 253, 2144, 4722, 3738, 253, 906, 310, 760, 323, 9648, 2173, 13260, 326, 5431, 13414, 2186, 275, 3946, 285, 760, 327, 1980, 1548, 18279, 1430, 253, 30628, 3839, 5194, 326, 253, 2144, 310, 973, 3559, 891, 1158, 253, 906, 812, 1421, 347, 247, 24655, 8805, 4404, 625, 2087, 1543, 285, 891, 1158, 326, 253, 1543, 403, 18338, 273, 9759, 387, 5723, 2824, 50276, 2577, 17401, 310, 281, 2997, 7384, 253, 1618, 273, 12316, 11269, 281, 253, 2929, 347, 7000, 407, 253, 2488, 403, 11407 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper tests performance of different node similarity measures when used in kmeans for clustering lfr graphs it provides recommendations for which measure is more appropriate for different parameter spaces it is an easy to read paper and well organized a rich set of graph similarity measures is studied there are however critical issues with the design of the experiments first the space of parameters considered is very unrealistic and hence most experiments and recommendations are not applicable this makes for example most of the space in figure 6 irrelevant for instance the exponent of the degree distribution is between 2 and 3 for most real world networks this goes up to 100 in the experiments where did you use the transformed 01 version the average degree is much smaller whereas graphs are usually much larger than the setting here a negative modularity is a result of poor parameter choices and generally signifies there is no cluster structure in the graph generally given this is focused on studying the space of lfr graphs one expects more careful understanding of what these parameters mean on a minor point lfr is not using a preferential attachment mechanism and is achieving a powerlaw degree distribution by directly sampling degrees and using the configuration model i suggest fixing the parameter settings and making sure the modularity is at least 01 for all the graphs the common link prediction measures eg number of common neighbours could also be added as simple baselines as well as more recent embedding based models where kmeans could be applied directly in the embedded space another possibility is to also include another clustering measure say a density based one the motivation needs to be expanded what justifies going n square when a graph based clustering could do the same job also in the results using ranking is good but the actual numbers are also meaningful it is hard to see if these methods are recovering anything meaningful when only the relative performance is reported basically if the ari is too small which means no correlation between results and the groundtruth you can still have a ranking of all the close to zero numbers docsepthe paper deals with the problem of community detection on graphs examining the impact of graph measures to do so the paper proposes an experimental framework where clustering is achieved using the kernel kmeans algorithm and the performance of graph measures is examined on various instances of artificially generated graphs using the lfr benchmark the overall approach is empirical supported mainly by the experimental results the main observations concern the consistent behavior of particular graph measures across multiple settings of the dataset strong points the paper addresses an important problem in network analysis which also concerns practitioners in a wide range of disciplines various graph measures are considered in the evaluation this is very interesting since in my view some of them are not very wellknown among the graph clustering and community detection communities the paper is wellstructured and wellwritten most of the concepts including the experimental framework are clearly presented weak points my main concern about the paper has to do with the consistency of the proposed evaluation framework under different evaluation criteria and graph data beyond the lfr benchmark firstly as the paper also mentions focusing only on lfr graphs can
definitely reveal important properties of algorithms but limits the generalization of the observations in the case where other generators might be used eg sbm or even realworld graphs besides the argument made in the paper that the lfr benchmark generates graphs similar to realworld ones is not very accurate lfr focuses on the clustering structure as well as on the degree distribution but might miss other key properties including graph diameter or the number of triangles or the clustering coefficient is there any evidence that could support the observations of the paper in the case of real graphs a closely related point has to do with the observation that realworld graphs do not have a clear clustering structure ie welldefined cuts focusing only on lfr graphs might not be enough to capture such instances the modularity criterion is used to evaluate the quality of communities which overall is a widely used criterion nevertheless modularity has been shown to be prone to the particular structure of the communities eg resolution limit how is this taken into account in the evaluation of the communities another point is that the paper is purely empirical definitely this is a good starting point especially when the focus is on experimental settings that have not been used before nevertheless despite the interesting observations i would expect to have a theoretical justification or some reasoning of why scct performs well on lfr graphs or for instance why ppr which is a widely used measure shows poor behavior in the paper all graph measures are considered as kernels how valid is this argument why not just using the pure kmeans for graph measures which are not kernels docsep 1 no good reasons to choose settings of evaluation particularly kernel kmeans and lfr 2 i think this paper may think about something very obvious the setting is kernel kmeans and if the similarity measure is given as some kernel it might be more clearly shown what kernel is good under what condition under kernel kmeans in other words we may find some connection between a kernel and the condition of data generation under the kernel kmeans already before doing some experiments i think this type of investigation is missing in this paper 3 so what is the reason why scct is the best andor why highlyranked methods are so i think that would be simply connected to the scheme of data generation of lfr or another feature in generating data or noise i think this point might be obvious but might become some good contribution while just data generation and comparison would not be something people can say contribution docsepusing 7500 lfrgenerated graphs as a benchmark suite the authors compare 25 graph clustering measures determining the best measure for every area of the parameter space the paper is well written mathematically sound and interesting and definitely useful to the graph theory community however as acknowledged by the authors the study is limited by the structure of the benchmark suite which is restricted to networks that can be generated by lfr rules overall i rate it as a weak accept pros the analysis is clear and grounded with a sufficient level of mathematical details the authors point out a clear winner out of the set of compared metrics cons the amount of novelty in the manuscript is limited the benchmark suite is very specific and no real world example that would have added great value to the submission is provided ### Summary:
this paper studies various graph measures in depth the paper was reviewed by three expert reviewers who complimented the ease of understanding because of the clear writing but they also expressed concerns about the limited novelty the lack of theoretical justification and the unrealistic setting the authors are encouraged to continue research taking into consideration the detailed comments provided by the reviewers
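several comments above are about concrete experimental hygiene: keeping the lfr degree exponent in the realistic 2-3 range, checking that the planted partition has modularity of at least 01, and reporting absolute ari values rather than only rankings. a small sketch of such a check, assuming networkx and scikit-learn are available; spectral clustering on the adjacency matrix is used here only as a simple stand-in for kernel kmeans with a graph-measure kernel, it is not the method evaluated in the paper:

```python
import networkx as nx
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics import adjusted_rand_score

# one lfr instance; parameter values follow the networkx documentation example,
# with the degree exponent tau1 kept at the edge of the realistic range discussed above
G = nx.LFR_benchmark_graph(n=250, tau1=3, tau2=1.5, mu=0.1,
                           average_degree=5, min_community=20, seed=10)

# planted partition: each node stores the community (a frozenset of nodes) it belongs to
communities = {frozenset(G.nodes[v]["community"]) for v in G}
q = nx.algorithms.community.modularity(G, communities)
print("planted modularity:", round(q, 3))  # sanity check: should be well above 0.1

# integer ground-truth labels for computing ari
label_of = {v: cid for cid, com in enumerate(communities) for v in com}
y_true = np.array([label_of[v] for v in sorted(G)])

# cluster from a precomputed similarity matrix (plain adjacency here; a graph-measure
# kernel matrix would be plugged in at this point instead)
A = nx.to_numpy_array(G, nodelist=sorted(G))
pred = SpectralClustering(n_clusters=len(communities), affinity="precomputed",
                          assign_labels="kmeans", random_state=0).fit_predict(A)

print("ari:", round(adjusted_rand_score(y_true, pred), 3))  # report the absolute value, not only a ranking
```

the same loop can be repeated over a grid of mixing parameters mu to reproduce the kind of parameter-space map the reviews discuss, while discarding instances whose planted modularity falls below the chosen threshold.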
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 5216, 3045, 273, 1027, 4666, 14259, 5593, 672, 908, 275, 465, 30799, 323, 17524, 298, 925, 14580, 352, 3400, 12645, 323, 534, 2557, 310, 625, 4569, 323, 1027, 4764, 8470, 50275, 262, 310, 3477, 281, 1239, 2929, 285, 973, 10932, 247, 6793, 873, 273, 4216, 14259, 5593, 310, 5421, 50276, 9088, 403, 2299, 4619, 3374, 342, 253, 2216, 273, 253, 4679, 806, 253, 2317, 273, 3602, 2783, 310, 1077, 46521, 285, 7613, 954, 4679, 285, 12645, 403, 417, 7763, 436, 2789, 323, 1650, 954, 273, 253, 2317, 275, 4677, 721, 19124, 323, 4227, 253, 23653, 273, 253, 4248, 3268, 310, 875, 374, 285, 495, 323, 954, 1524, 1533, 6928, 436, 4566, 598, 281, 2233, 275, 253, 4679, 835, 858, 368, 897, 253, 13657, 14805, 2715, 253, 3388, 4248, 310, 1199, 4577, 5727, 14580, 403, 3798, 1199, 4067, 685, 253, 4758, 1060, 247, 4016, 23178, 414, 310, 906, 273, 4105, 4764, 10165, 285, 3839, 861, 7790, 627, 310, 642, 7368, 2605, 275, 253, 4216, 3839, 1677, 436, 310, 7106, 327, 12392, 253, 2317, 273, 298, 925, 14580, 581, 21973, 625, 10182, 4685, 273, 752, 627, 3602, 1599, 327, 247, 5884, 1127, 298, 925, 310, 417, 970, 41637, 14170, 5122, 285, 310, 17170, 1612, 6937, 4248, 3268, 407, 3587, 10491, 7759, 285, 970, 6661, 1566, 50276, 74, 1804, 18505, 253, 4764, 7533, 285, 2403, 2119, 253, 23178, 414, 310, 387, 1878, 14805, 323, 512, 253, 14580, 253, 1846, 3048, 10554, 5593, 24088, 1180, 273, 1846, 31359, 812, 671, 320, 2879, 347, 2969, 1666, 25379, 347, 973, 347, 625, 3332, 21496, 1754, 3210, 835, 465, 30799, 812, 320, 3732, 3587, 275, 253, 12691, 2317, 1529, 6387, 310, 281, 671, 2486, 1529, 17524, 2557, 1333, 247, 4038, 1754, 581, 253, 16038, 3198, 281, 320, 11848, 752, 816, 7790, 1469, 295, 6278, 672, 247, 4216, 1754, 17524, 812, 513, 253, 1072, 2628, 671, 275, 253, 1543, 970, 19947, 310, 1175, 533, 253, 4588, 3904, 403, 671, 14282, 352, 310, 1892, 281, 923, 604, 841, 3082, 403, 27930, 2712, 14282, 672, 760, 253, 4103, 3045, 310, 2361, 10323, 604, 253, 247, 363, 310, 1512, 1355, 534, 2097, 642, 5921, 875, 1543, 285, 253, 3216, 33024, 368, 476, 1335, 452, 247, 19947, 273, 512, 253, 2810, 281, 5058, 3904, 5474, 339, 431, 248, 2929, 13330, 342, 253, 1895, 273, 3114, 5481, 327, 14580, 17565, 253, 3486, 273, 4216, 5593, 281, 513, 594, 253, 2929, 29328, 271, 5661, 7792, 835, 17524, 310, 6786, 970, 253, 10295, 465, 30799, 5933, 285, 253, 3045, 273, 4216, 5593, 310, 6730, 327, 2710, 10872, 273, 41544, 4561, 14580, 970, 253, 298, 925, 22791, 253, 4583, 2746, 310, 16774, 4516, 7194, 407, 253, 5661, 1543, 253, 2022, 7313, 4468, 253, 5185, 3879, 273, 1798, 4216, 5593, 2439, 2709, 7533, 273, 253, 10895, 50276, 9072, 2792, 50275, 783, 2929, 12453, 271, 1774, 1895, 275, 2990, 1783, 534, 671, 7350, 24432, 275, 247, 4618, 2491, 273, 32870, 50275, 2044, 784, 4216, 5593, 403, 2783, 275, 253, 7103, 436, 310, 1077, 4722, 1580, 275, 619, 1859, 690, 273, 731, 403, 417, 1077, 973, 4304, 2190, 253, 4216, 17524, 50276, 24925, 5481, 7888, 50275, 783, 2929, 310, 973, 34218, 285, 973, 15720, 954, 273, 253, 12342, 1690, 253, 5661, 7792, 403, 4518, 3559, 50275, 20881, 2792, 50275, 2577, 2022, 4468, 670, 253, 2929, 556, 281, 513, 342, 253, 15274, 273, 253, 4081, 7103, 7792, 762, 1027, 7103, 6866, 285, 4216, 941, 4457, 253, 298, 925, 22791, 41005, 347, 253, 2929, 671, 25957, 13654, 760, 327, 298, 925, 14580, 476, 7964, 10313, 1774, 3607, 273, 11333, 533, 7787, 253, 26647, 273, 253, 7313, 275, 253, 1083, 835, 643, 
21025, 1537, 320, 908, 24088, 256, 5844, 390, 1014, 1524, 10186, 14580, 16280, 253, 4154, 1160, 275, 253, 2929, 326, 253, 298, 925, 22791, 15693, 14580, 2074, 281, 1524, 10186, 4394, 310, 417, 1077, 7899, 298, 925, 16633, 327, 253, 17524, 2605, 347, 973, 347, 327, 253, 4248, 3268, 533, 1537, 2985, 643, 2234, 3607, 1690, 4216, 9080, 390, 253, 1180, 273, 30102, 390, 253, 17524, 10235, 310, 627, 667, 1941, 326, 812, 1329, 253, 7313, 273, 253, 2929, 275, 253, 1083, 273, 1524, 14580, 50274, 66, 8244, 2905, 1127, 556, 281, 513, 342, 253, 8310, 326, 1524, 10186, 14580, 513, 417, 452, 247, 2590, 17524, 2605, 26332, 6210, 392, 37224, 12176, 13654, 760, 298, 925, 14580, 1537, 417, 320, 2217, 281, 9232, 824, 10872, 50274, 783, 23178, 414, 17705, 310, 908, 281, 7472, 253, 3290, 273, 7888, 534, 4583, 310, 247, 7561, 908, 17705, 17837, 23178, 414, 556, 644, 2011, 281, 320, 9791, 939, 281, 253, 1798, 2605, 273, 253, 7888, 24088, 6064, 2701, 849, 310, 436, 2668, 715, 2395, 275, 253, 7103, 273, 253, 7888, 50275, 23955, 1127, 310, 326, 253, 2929, 310, 15846, 16774, 7964, 436, 310, 247, 1175, 4983, 1127, 3340, 672, 253, 2770, 310, 327, 5661, 7533, 326, 452, 417, 644, 908, 1078, 17837, 5747, 253, 4722, 7313, 891, 651, 1902, 281, 452, 247, 10527, 22861, 390, 690, 14720, 273, 2139, 660, 291, 17923, 973, 327, 298, 925, 14580, 390, 323, 4227, 2139, 268, 1087, 534, 310, 247, 7561, 908, 2557, 2722, 4105, 3879, 50275, 249, 253, 2929, 512, 4216, 5593, 403, 2783, 347, 34501, 849, 3588, 310, 436, 4154, 2139, 417, 816, 970, 253, 6313, 465, 30799, 323, 4216, 5593, 534, 403, 417, 34501, 50274, 7152, 33032, 337, 642, 1175, 4606, 281, 5206, 7533, 273, 7103, 3782, 10295, 465, 30799, 285, 298, 925, 50275, 19, 891, 1158, 436, 2929, 778, 1158, 670, 1633, 1077, 4755, 253, 4758, 310, 10295, 465, 30799, 285, 604, 253, 14259, 2557, 310, 1677, 347, 690, 10295, 352, 1537, 320, 625, 4518, 2011, 752, 10295, 310, 1175, 762, 752, 1617, 762, 10295, 465, 30799, 50276, 249, 643, 3000, 359, 778, 1089, 690, 4602, 875, 247, 10295, 285, 253, 1617, 273, 941, 5978, 762, 253, 10295, 465, 30799, 2168, 1078, 2509, 690, 4679, 50276, 74, 1158, 436, 1511, 273, 5839, 310, 5816, 275, 436, 2929, 50276, 20, 594, 752, 310, 253, 1921, 2139, 660, 291, 310, 253, 1682, 285, 263, 2139, 4122, 14714, 264, 3082, 403, 594, 891, 1158, 326, 651, 320, 3365, 4802, 281, 253, 6974, 273, 941, 5978, 273, 298, 925, 390, 1529, 4735, 275, 11365, 941, 390, 6046, 891, 1158, 436, 1127, 1537, 320, 4755, 533, 1537, 2489, 690, 1175, 7680, 1223, 816, 941, 5978, 285, 5301, 651, 417, 320, 1633, 952, 476, 1333, 7680, 50275, 7152, 33032, 5302, 818, 5388, 298, 925, 20419, 14580, 347, 247, 49602, 18880, 253, 4477, 7277, 2030, 4216, 17524, 5593, 8925, 253, 1682, 2557, 323, 1046, 2170, 273, 253, 4764, 2317, 253, 2929, 310, 973, 3542, 11076, 1037, 3590, 285, 4722, 285, 7964, 4217, 281, 253, 4216, 3762, 3114, 2299, 347, 14969, 407, 253, 4477, 253, 1263, 310, 3710, 407, 253, 2605, 273, 253, 22791, 49130, 534, 310, 11096, 281, 6928, 326, 476, 320, 4561, 407, 298, 925, 4803, 4583, 891, 2281, 352, 347, 247, 5075, 2997, 50276, 856, 84, 50276, 783, 1783, 310, 2590, 285, 28462, 342, 247, 4209, 1268, 273, 15965, 4278, 50276, 783, 4477, 1127, 562, 247, 2590, 13688, 562, 273, 253, 873, 273, 2429, 17082, 50276, 5040, 50276, 783, 2408, 273, 38135, 275, 253, 7714, 310, 3710, 50275, 783, 22791, 18880, 310, 1077, 2173, 285, 642, 1524, 1533, 1650, 326, 651, 452, 2879, 1270, 1318, 281, 253, 19529, 310, 2530, 2490, 187, 4118, 18435, 27, 2520, 2929, 2175, 2710, 4216, 5593, 275, 6864, 50276, 783, 2929, 369, 9814, 
407, 1264, 6485, 30628, 665, 48912, 253, 11990, 273, 4685, 984, 273, 2590, 4028, 533, 597, 671, 4469, 7350, 323, 3710, 38135, 10527, 22861, 285, 46521, 4758, 253, 4477, 403, 14659, 281, 4035, 2561, 3192, 715, 8180, 253, 7000, 5701, 2530, 407, 253, 30628 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 5216, 3045, 273, 1027, 4666, 14259, 5593, 672, 908, 275, 465, 30799, 323, 17524, 298, 925, 14580, 352, 3400, 12645, 323, 534, 2557, 310, 625, 4569, 323, 1027, 4764, 8470, 50275, 262, 310, 3477, 281, 1239, 2929, 285, 973, 10932, 247, 6793, 873, 273, 4216, 14259, 5593, 310, 5421, 50276, 9088, 403, 2299, 4619, 3374, 342, 253, 2216, 273, 253, 4679, 806, 253, 2317, 273, 3602, 2783, 310, 1077, 46521, 285, 7613, 954, 4679, 285, 12645, 403, 417, 7763, 436, 2789, 323, 1650, 954, 273, 253, 2317, 275, 4677, 721, 19124, 323, 4227, 253, 23653, 273, 253, 4248, 3268, 310, 875, 374, 285, 495, 323, 954, 1524, 1533, 6928, 436, 4566, 598, 281, 2233, 275, 253, 4679, 835, 858, 368, 897, 253, 13657, 14805, 2715, 253, 3388, 4248, 310, 1199, 4577, 5727, 14580, 403, 3798, 1199, 4067, 685, 253, 4758, 1060, 247, 4016, 23178, 414, 310, 906, 273, 4105, 4764, 10165, 285, 3839, 861, 7790, 627, 310, 642, 7368, 2605, 275, 253, 4216, 3839, 1677, 436, 310, 7106, 327, 12392, 253, 2317, 273, 298, 925, 14580, 581, 21973, 625, 10182, 4685, 273, 752, 627, 3602, 1599, 327, 247, 5884, 1127, 298, 925, 310, 417, 970, 41637, 14170, 5122, 285, 310, 17170, 1612, 6937, 4248, 3268, 407, 3587, 10491, 7759, 285, 970, 6661, 1566, 50276, 74, 1804, 18505, 253, 4764, 7533, 285, 2403, 2119, 253, 23178, 414, 310, 387, 1878, 14805, 323, 512, 253, 14580, 253, 1846, 3048, 10554, 5593, 24088, 1180, 273, 1846, 31359, 812, 671, 320, 2879, 347, 2969, 1666, 25379, 347, 973, 347, 625, 3332, 21496, 1754, 3210, 835, 465, 30799, 812, 320, 3732, 3587, 275, 253, 12691, 2317, 1529, 6387, 310, 281, 671, 2486, 1529, 17524, 2557, 1333, 247, 4038, 1754, 581, 253, 16038, 3198, 281, 320, 11848, 752, 816, 7790, 1469, 295, 6278, 672, 247, 4216, 1754, 17524, 812, 513, 253, 1072, 2628, 671, 275, 253, 1543, 970, 19947, 310, 1175, 533, 253, 4588, 3904, 403, 671, 14282, 352, 310, 1892, 281, 923, 604, 841, 3082, 403, 27930, 2712, 14282, 672, 760, 253, 4103, 3045, 310, 2361, 10323, 604, 253, 247, 363, 310, 1512, 1355, 534, 2097, 642, 5921, 875, 1543, 285, 253, 3216, 33024, 368, 476, 1335, 452, 247, 19947, 273, 512, 253, 2810, 281, 5058, 3904, 5474, 339, 431, 248, 2929, 13330, 342, 253, 1895, 273, 3114, 5481, 327, 14580, 17565, 253, 3486, 273, 4216, 5593, 281, 513, 594, 253, 2929, 29328, 271, 5661, 7792, 835, 17524, 310, 6786, 970, 253, 10295, 465, 30799, 5933, 285, 253, 3045, 273, 4216, 5593, 310, 6730, 327, 2710, 10872, 273, 41544, 4561, 14580, 970, 253, 298, 925, 22791, 253, 4583, 2746, 310, 16774, 4516, 7194, 407, 253, 5661, 1543, 253, 2022, 7313, 4468, 253, 5185, 3879, 273, 1798, 4216, 5593, 2439, 2709, 7533, 273, 253, 10895, 50276, 9072, 2792, 50275, 783, 2929, 12453, 271, 1774, 1895, 275, 2990, 1783, 534, 671, 7350, 24432, 275, 247, 4618, 2491, 273, 32870, 50275, 2044, 784, 4216, 5593, 403, 2783, 275, 253, 7103, 436, 310, 1077, 4722, 1580, 275, 619, 1859, 690, 273, 731, 403, 417, 1077, 973, 4304, 2190, 253, 4216, 17524, 50276, 24925, 5481, 7888, 50275, 783, 2929, 310, 973, 34218, 285, 973, 15720, 954, 273, 253, 12342, 1690, 253, 5661, 7792, 403, 4518, 3559, 50275, 20881, 2792, 50275, 2577, 2022, 4468, 670, 253, 2929, 556, 281, 513, 342, 253, 15274, 273, 253, 4081, 7103, 7792, 762, 1027, 7103, 6866, 285, 4216, 941, 4457, 253, 298, 925, 22791, 41005, 347, 253, 2929, 671, 25957, 13654, 760, 327, 298, 925, 14580, 476, 7964, 10313, 1774, 3607, 273, 11333, 533, 7787, 253, 26647, 273, 253, 7313, 275, 253, 1083, 835, 643, 
21025, 1537, 320, 908, 24088, 256, 5844, 390, 1014, 1524, 10186, 14580, 16280, 253, 4154, 1160, 275, 253, 2929, 326, 253, 298, 925, 22791, 15693, 14580, 2074, 281, 1524, 10186, 4394, 310, 417, 1077, 7899, 298, 925, 16633, 327, 253, 17524, 2605, 347, 973, 347, 327, 253, 4248, 3268, 533, 1537, 2985, 643, 2234, 3607, 1690, 4216, 9080, 390, 253, 1180, 273, 30102, 390, 253, 17524, 10235, 310, 627, 667, 1941, 326, 812, 1329, 253, 7313, 273, 253, 2929, 275, 253, 1083, 273, 1524, 14580, 50274, 66, 8244, 2905, 1127, 556, 281, 513, 342, 253, 8310, 326, 1524, 10186, 14580, 513, 417, 452, 247, 2590, 17524, 2605, 26332, 6210, 392, 37224, 12176, 13654, 760, 298, 925, 14580, 1537, 417, 320, 2217, 281, 9232, 824, 10872, 50274, 783, 23178, 414, 17705, 310, 908, 281, 7472, 253, 3290, 273, 7888, 534, 4583, 310, 247, 7561, 908, 17705, 17837, 23178, 414, 556, 644, 2011, 281, 320, 9791, 939, 281, 253, 1798, 2605, 273, 253, 7888, 24088, 6064, 2701, 849, 310, 436, 2668, 715, 2395, 275, 253, 7103, 273, 253, 7888, 50275, 23955, 1127, 310, 326, 253, 2929, 310, 15846, 16774, 7964, 436, 310, 247, 1175, 4983, 1127, 3340, 672, 253, 2770, 310, 327, 5661, 7533, 326, 452, 417, 644, 908, 1078, 17837, 5747, 253, 4722, 7313, 891, 651, 1902, 281, 452, 247, 10527, 22861, 390, 690, 14720, 273, 2139, 660, 291, 17923, 973, 327, 298, 925, 14580, 390, 323, 4227, 2139, 268, 1087, 534, 310, 247, 7561, 908, 2557, 2722, 4105, 3879, 50275, 249, 253, 2929, 512, 4216, 5593, 403, 2783, 347, 34501, 849, 3588, 310, 436, 4154, 2139, 417, 816, 970, 253, 6313, 465, 30799, 323, 4216, 5593, 534, 403, 417, 34501, 50274, 7152, 33032, 337, 642, 1175, 4606, 281, 5206, 7533, 273, 7103, 3782, 10295, 465, 30799, 285, 298, 925, 50275, 19, 891, 1158, 436, 2929, 778, 1158, 670, 1633, 1077, 4755, 253, 4758, 310, 10295, 465, 30799, 285, 604, 253, 14259, 2557, 310, 1677, 347, 690, 10295, 352, 1537, 320, 625, 4518, 2011, 752, 10295, 310, 1175, 762, 752, 1617, 762, 10295, 465, 30799, 50276, 249, 643, 3000, 359, 778, 1089, 690, 4602, 875, 247, 10295, 285, 253, 1617, 273, 941, 5978, 762, 253, 10295, 465, 30799, 2168, 1078, 2509, 690, 4679, 50276, 74, 1158, 436, 1511, 273, 5839, 310, 5816, 275, 436, 2929, 50276, 20, 594, 752, 310, 253, 1921, 2139, 660, 291, 310, 253, 1682, 285, 263, 2139, 4122, 14714, 264, 3082, 403, 594, 891, 1158, 326, 651, 320, 3365, 4802, 281, 253, 6974, 273, 941, 5978, 273, 298, 925, 390, 1529, 4735, 275, 11365, 941, 390, 6046, 891, 1158, 436, 1127, 1537, 320, 4755, 533, 1537, 2489, 690, 1175, 7680, 1223, 816, 941, 5978, 285, 5301, 651, 417, 320, 1633, 952, 476, 1333, 7680, 50275, 7152, 33032, 5302, 818, 5388, 298, 925, 20419, 14580, 347, 247, 49602, 18880, 253, 4477, 7277, 2030, 4216, 17524, 5593, 8925, 253, 1682, 2557, 323, 1046, 2170, 273, 253, 4764, 2317, 253, 2929, 310, 973, 3542, 11076, 1037, 3590, 285, 4722, 285, 7964, 4217, 281, 253, 4216, 3762, 3114, 2299, 347, 14969, 407, 253, 4477, 253, 1263, 310, 3710, 407, 253, 2605, 273, 253, 22791, 49130, 534, 310, 11096, 281, 6928, 326, 476, 320, 4561, 407, 298, 925, 4803, 4583, 891, 2281, 352, 347, 247, 5075, 2997, 50276, 856, 84, 50276, 783, 1783, 310, 2590, 285, 28462, 342, 247, 4209, 1268, 273, 15965, 4278, 50276, 783, 4477, 1127, 562, 247, 2590, 13688, 562, 273, 253, 873, 273, 2429, 17082, 50276, 5040, 50276, 783, 2408, 273, 38135, 275, 253, 7714, 310, 3710, 50275, 783, 22791, 18880, 310, 1077, 2173, 285, 642, 1524, 1533, 1650, 326, 651, 452, 2879, 1270, 1318, 281, 253, 19529, 310, 2530, 2490, 187, 4118, 18435, 27, 2520, 2929, 2175, 2710, 4216, 5593, 275, 6864, 50276, 783, 2929, 369, 9814, 
407, 1264, 6485, 30628, 665, 48912, 253, 11990, 273, 4685, 984, 273, 2590, 4028, 533, 597, 671, 4469, 7350, 323, 3710, 38135, 10527, 22861, 285, 46521, 4758, 253, 4477, 403, 14659, 281, 4035, 2561, 3192, 715, 8180, 253, 7000, 5701, 2530, 407, 253, 30628 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the main contributions 1 novel algorithm for linear bandits 2 upper and lower performance bounds 3 numerical experiments 4 smaller comparison and collation of existing methods the authors propose a new algorithm that combines various advancements of previous work and add their own novel contribution in the usage of goptimality they then proceed to show differences to existing algorithms and to prove lower and upper bounds the proof for minimax is argued verbally and more loosely than the rest of the paper numerical results then demonstrate the effectiveness on one experimental setup strengths 1 strong and novel and complex algorithm 2 good mathematical analysis and comparison to other papers 3 good upper and lower bound proofs 4 good numerical results weaknesses 1 weak verbal proof of minimax limitations 1 minimax is perhaps not strictly satisfied and so the core claim of the paper is slightly in question societal impact 1 everything seems to have been addressed docsepthe paper studies the problem of best arm identification in linear bandits with a budget t a key element in their algorithm is the utilization of a goptimal design to select arms resulting in a parameterfree algorithm that at each step eliminates nonoptimal arms the authors provide theoretical results for the error probability depending only on the top d arms where d is the effective dimension of the arm vectors seen as a vector space the authors compare their algorithm to other previous algorithms and present experiments showcasing the empirical and theoretical improvements of their algorithm over the previous ones strengths their algorithm has several improvements over previous algorithms in the literature in particular while the idea of partitioning the set of arms into pieces and eliminating certain pieces of the set at each step isnt new most divide the budget into log2 k phases where k is the number of arms the authors algorithm divides the budget into log2d phases where d is the effective dimension of the set of arms as a vector space moreover when compared to peace in katzsamuels the algorithm is fully parameterfree and does not need that suboptimal arms satisfy a certain inequality if the inequality is not satisfied the linear bandit instance needs to be rescaled before peace is run resulting in a larger bound on the error probability furthermore the exponential term of the error probability is improved by logarithmic factors from katzsamuels et al 2020 ayya et al 2021 and azizi et al 2021 finally it seems their algorithm performs well empirically another important point is that their lower bound is tight and their algorithm is minimax optimal up to multiplicative factors in the exponent weakness the authors do not seem to directly address potential applications and limitations of their work i think they would benefit from including a real world example where linear bandits with a budget are used minor typographical error line 182 theta change the cdots for ldots the authors do not seem to mention any reallife applications that their work can have they do mention other contexts where bandits are applied in the real world but not including a budget setting i think they would benefit from adding a potential application in industry of their work they also do not discuss any negative societal impact of their work but i think for theory papers this is a bit harder to do docsepthis paper proposes
a study of fixed budget bai with linear bandits the authors derive a minimax lower bound for linear bai based on the lower bound provided by carpentier and locatelli and propose an algorithm with an upper bound matching the lower bound the authors work is based on ideas that seem natural and convincing in the context of bai study the theoretical contributions are not surprising and the results may be somewhat obvious given the existing context however it is a result that deserves to be adopted in this field therefore i vote to weak accept  after receiving a reply from the author  i thank the authors for their replies to my comments first weak accept in my comment was a typo for borderline accept after receiving your reply i checked the submitted manuscript again but the scores were still the same  this study deals with bai for linear bandits in a fixed budget setting which has not been well studied in that sense it is interesting however as a highlevel comment i felt that there is little strong technical novelty in that sense i do not think it makes a strong contribution  in conclusion while i do not strongly support acceptance there is no reason to reject it and there is novelty in the proposed method for this reason i have voted to borderline the scoring although the score may change after discussion among the reviewers  see the above comments  docsepthis work studies the linear bandit problem with the goal of identifying the best arm the goal is to minimize the error probability when the best arm is not identified with a limited number of pulls the contribution of this paper is threefold 1 the authors introduced an eliminationbased algorithm called odlinbai utilizing the goptimal design compared with the prior art the proposed one is parameterfree 2 the authors also presented a lower bound which shows that odlinbai is nearly optimal 3 experiment results are presented to demonstrate the superior performance of the proposed algorithm  strengths  this paper is written well and in good presentation which makes it easy to follow  the newly proposed algorithm odlinbai is parameterfree and computationally efficient which makes it easier to be applied to realworld applications  both synthetic and realworld experiments are provided to prove the superior performance of the proposed policy   weaknesses   the ideas and techniques applied in this paper are not new specifically the proposed algorithm is quite similar to the one proposed in 1 although the latter one targets best arm identification under the fixedconfidence setting i am not saying it is a bad thing since the goals of the problems in these two papers are different however this will harm the originality of this paper  1 chao tao saul blanco and yuan zhou best arm identification in linear bandits with linear dimension dependency in international conference on machine learning pages 4877-4886 pmlr 2018   there is still some gap between the upper and lower bounds despite the constant parameters inside the exponent in particular the parameter is omega frackd log2d  na ### Summary:
this paper proposed an algorithm and presented a good theoretical evaluation in particular the tight lower bound is a nice theoretical contribution it is also good that it appears to be descriptively and mathematically sound the novelty of the algorithm has been questioned by some reviewers but even with that contribution it is shown to have some sufficient advantage
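To make the algorithmic setting discussed in the reviews above concrete, the following is a minimal illustrative sketch of fixed-budget best-arm identification in a linear bandit via phased elimination with an approximate G-optimal design. It is not the OD-LinBAI procedure from the reviewed paper: the real algorithm splits the budget into roughly log2(d) phases and uses a more careful elimination schedule, whereas this sketch simply halves the surviving arm set each phase; the Frank-Wolfe design routine, all function names, and all parameter values are assumptions made purely for illustration.

```python
# Illustrative sketch only (not the reviewed paper's OD-LinBAI algorithm).
import numpy as np


def g_optimal_design(X, iters=300):
    """Approximate G-optimal design weights over arm features X (n x d) via Frank-Wolfe."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)
    for _ in range(iters):
        A = X.T @ (w[:, None] * X) + 1e-9 * np.eye(d)
        A_inv = np.linalg.inv(A)
        g = np.einsum("ij,jk,ik->i", X, A_inv, X)  # g_a = x_a^T A(w)^{-1} x_a
        a = int(np.argmax(g))                      # arm with the worst predicted variance
        step = (g[a] / d - 1.0) / (g[a] - 1.0 + 1e-12)
        step = float(np.clip(step, 0.0, 1.0))
        w *= (1.0 - step)
        w[a] += step
    return w


def phased_elimination_bai(X, pull, budget):
    """Spend `budget` pulls, halving the candidate arm set each phase; return the guessed best arm."""
    d = X.shape[1]
    arms = np.arange(len(X))
    n_phases = max(1, int(np.ceil(np.log2(len(X)))))
    per_phase = budget // n_phases
    for _ in range(n_phases):
        if len(arms) == 1:
            break
        Xa = X[arms]
        w = g_optimal_design(Xa)
        counts = np.maximum(1, np.round(w * per_phase).astype(int))  # may overshoot slightly
        A, b = 1e-9 * np.eye(d), np.zeros(d)
        for i, a in enumerate(arms):
            for _ in range(counts[i]):
                r = pull(a)
                A += np.outer(X[a], X[a])
                b += r * X[a]
        theta_hat = np.linalg.solve(A, b)
        order = np.argsort(Xa @ theta_hat)[::-1]      # best estimated arms first
        arms = arms[order[: max(1, len(arms) // 2)]]  # keep the better half
    return int(arms[0])


# Tiny demo on a random instance (illustration only).
rng = np.random.default_rng(0)
d, K = 5, 30
X = rng.normal(size=(K, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)
theta = rng.normal(size=d)
pull = lambda a: float(X[a] @ theta + rng.normal())
print("true best:", int(np.argmax(X @ theta)),
      "guessed:", phased_elimination_bai(X, pull, budget=5000))
```

The G-optimal design step is what the reviews highlight as the source of the parameter-free behaviour: the design weights tell the learner how to spread the per-phase budget over the surviving arms without any confidence-interval tuning parameters.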
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper develops highdegree lower bounds for the sparse pca and tensor pca problems for both of these problems there is a wellknown discrepancy between the statistical complexity ie samples or snr required to detect the hidden signal by any means feasible and computational complexity ie samples or snr required to approximately recover the hidden signal by computationally efficient ie poly time algorithms i phrase the above in terms of detection to align with the paper analogous definitions using recovery may be made with the obvious ordering that detection is easier than recovery along with the planted clique problem sparse and tensor pca are arguably some of the most wellstudied examples of statistical problems exhibiting this computationalstatistical phenomenon while one way of developing hardness arguments is to construct problem reductions this paper contributes to the line of unconditional hardness results by analyzing the sumofsquares hierarchy of convex programs for sparsetensor pca the first two steps also called degrees in this hierarchy for sparse pca were shown ineffective by krauthgamer et al and wigderson and ma respectively this paper develops bounds for the next $d^{\epsilon}$ many degrees for a small constant $\epsilon$  strengths 1 i think the paper is an interesting and solid contribution to the hardness results on sparse pca and tensor pca both problems are sparse pca somewhat more than tensor pca wellmotivated and used in applications 2 the proof techniques are to my mind also potentially interesting to the theoryinclined audience proving sumofsquares lower bounds tends to be a technically challenging intricate and involved task and the authors do a good job in the beginning of the appendix explaining some of the highlevel ideas and challenges  weaknesses 1 at the outset and given 2 neurips might not be the best fit for this submission i would expect colt theoretical cs conferences stoc focs and of course a number of journals on the other hand to be quite wellsuited 2 the interesting and generalizable part of the paper seems to be some of the proof techniques and these are relegated to the appendix i did not have time to read the full appendix 65 pages but the first 15-20 were useful 3 separately from 1 the paper is made inaccessible by a somewhat loose use of notation that is widespread as a small example after using d for dimension the authors also use it for the degree of the sos program in the beginning of sec 2 for accessibility to a statsml audience this paper would require significant polishing 4 related to 3 i trust the results despite point 2 but some of the main results in the text theorems 3.1 and 3.2 likely should be rephrased na docsepthis paper fits into the broad area of theoretically understanding informationcomputation tradeoffs for statistical inference the abstract setup is that there is some signal x typically in the vector space $\mathbb{R}^d$ and as input we are given noisy observations about x $v_1, v_2, \dots, v_m$ and the goal is to infer x up to the best accuracy possible via an efficient algorithm when m is too small the problem is impossible to solve even information theoretically at some threshold for m the problem becomes information theoretically identifiable information threshold but could possibly remain computationally hard and at a possibly distinct threshold computational threshold for m the problem also becomes algorithmically easy to solve understanding the computational 
threshold for averagecase problems and in particular if there is a gap in between the information and computational thresholds is a topic that has attracted a lot of attention in the theoretical computer science community to understand the computational threshold one must give an algorithm along with some evidence that the problem is hard under the threshold which usually comes in the form of lower bounds against particular algorithms semidefinite programs statistical queries polynomialbased methods eigenvalue methods iterative algorithms etc a ubiquitous phenomenon for a wide variety of such averagecase algorithmic problems is that at some threshold several simple algorithms and most notably some spectral algorithm succeed at solving the problem and under the threshold many seemingly more powerful algorithms fail this work is concerned with providing more evidence to our understanding of this computational threshold for the sparse pca and also tensor pca problems in the sparse pca problem the hidden signal x is a random ddimensional vector supported on a much smaller number k of coordinates where the nonzero entries are random in $\{\pm 1\}$ the observations are independent samples from the gaussian distribution with covariance spiked by x ie $N(0, \mathrm{Id} + \lambda xx^{\top})$ where $\lambda$ is some positive scalar to be interpreted as a signaltonoise parameter the main result of this paper is that a certain powerful subexponentialsized semidefinite programmingbased algorithm the degree $n^{\epsilon}$ sumofsquares algorithm fails to solve sparse pca better than a simple spectral algorithm which adds to our understanding of how hard this problem should be similar results are presented for the tensor pca problem as well the current state of affairs in obtaining computational hardness for restricted classes of algorithms is that when we want to argue that its computationally hard to beat a simple spectral algorithm we study the following methods of hardness which are listed in roughly increasing order of difficulty 1 lowdegree polynomials 2 statistical queries 3 semidefinite programming lower bounds sumofsquares and this papers contributions fall here 4 reductionbased hardness  i must note that the gap between the difficulty of 2 and 3 is quite large in the current sociological setting of the subarea it is a generally widespread belief that a hardness result against 1 or 2 does mean that our current algorithmic techniques likely wont surmount the barriers they are facing and likely new algorithmic ideas are needed an important question to answer to support this belief is to show that hardness results for 1 and 2 which are often easy to establish imply hardness results for other algorithms such metatheorems are hard to come by with no particularly promising attacks for the case of sumofsquares and therefore the current approach is to substantiate the conjecture via a rich set of example lower bounds which together hopefully illuminate a path to such metatheorems  with respect to this papers strengths and weaknesses i would like to touch upon two points a sparse pca is a problem that has received a lot of attention in theoretical algorithmic statistics and in light of that evidence for computational thresholds is interesting however i would not say that the paper gave us a surprising answer as to where the computational threshold should lie the preexisting lower bounds in the style of 1 and 2 already told us what we think the answer should be b i think the papers main strength is in progress in substantiating the predictions 1 and 
2 have for sparse pca since sumofsquares lower bounds are only known for a handful of problems especially since they are technically difficult to achieve it is worth noting that the best reductionbased hardness conditional on some conjecture about planted clique only rules out quasipolynomial time algorithms as opposed to subexponential algorithms some of the other existing sos lower bounds planted clique csps sk model highly use problemspecific structure and the most desirable aspect for any new sos lower bound is to rely less on the problemspecific structure and try to abstract out the conditions needed it is hard for me to tell whether this lower bound does this given the time constraint of the review but it would be nice if the authors can comment on this overall i recommend acceptance docsepsparse pca is a fundamental problem in the machine learning and statistical inference community suppose we are given vectors $v_1, \dots, v_m \in \mathbb{R}^d$ sampled from the meanzero gaussian distribution with covariance $\mathrm{Id} + \lambda uu^{\top}$ where $u$ is a ksparse vector the task is to recover the hidden vector $u$ principal component this is the wishart model there are several parameters in play here the number of samples m the dimension d the signaltonoise ratio $\lambda$ and the sparsity k a long line of work has established an almost complete picture of the tractability of sparse pca in different parameter regimes in particular in the hard regime where $m\lambda^2 \ll d$ and $m\lambda^2 \ll k^2$ no known algorithm exists this paper shows that the powerful sumofsquares sos algorithm fails to solve sparse pca in this hard regime even with degree $d^{O(1)}$ specifically the authors show that the natural sos relaxation at degree $d^{O(\epsilon)}$ fails at the easier task of distinguishing between vectors sampled randomly vs vectors sampled from the spiked model when $m\lambda^2 \ll d^{1-\epsilon}$ and $m\lambda^2 \ll k^{2-\epsilon}$ their techniques also apply to tensor pca for $k \geq 2$ we are given an order k tensor $A = \lambda u^{\otimes k} + B$ where $u \in \mathbb{R}^n$ is a unit vector and $B$ is a tensor with gaussian entries and the task is to recover $u$ they prove analogous sos lower bounds for the regime $\lambda \ll n^{k/4}$  techniques  the authors showed that in the hard regime even when given random inputs no planted spike the canonical sos relaxation has a large optimal value of $m + m\lambda(1 - o(1))$ thus failing to distinguish from the case when the inputs are drawn from the spiked gaussian to prove this the authors constructed a candidate pseudodistribution from the spiked distribution aka planted distribution using the pseudocalibration method which is by now a standard technique to prove sos lower bounds  as in the sos lower bound for planted clique 18 they decompose the pseudomoment matrix into a linear combination of graph matrices which are structured random matrices that can be represented as graphs called shapes and whose spectral norms can be bounded by analyzing vertex separators in the shape  following planted clique 18 they factorize each shape into left right and middle shapes and then bound the intersection terms the errors in the factorization one technicality is that the coefficients in the linear combination also depend on the shapes which they handle by analyzing the coefficient matrix they show that the middle part is psd by charging nontrivial shapes to the diagonal which implies the whole matrix is psd the bulk of their proof is in fact handling the intersection terms  strengths  this paper is important as it gives insights into the complexity of sparsetensor pca and also the 
power of sos algorithms it provides a strong piece of evidence that the hardness regimes are indeed hard  previous work by hopkins et al 49 claimed sos lower bounds for sparse pca in the spiked wigner model though their proofs are not online this paper generalizes their ideas and proves an sos lower bound for the more natural wishart model i view this as the main contribution of this paper as in previous papers proving sos lower bounds requires a significant amount of work especially for polynomial degree therefore even though many techniques in this paper are not new eg factorization of shapes charging arguments etc grinding out all the technical details is a strength in my opinion  weaknesses  i would like to see a detailed comparison to 49 and why their techniques fail for the wishart model i believe that 49 has most of the main ideas even though a full proof hasnt been posted there must be some technical challenges hidden in the details but this was not explained in either the main paper or the appendix finally it is almost impossible for readers unfamiliar with graph matrices to follow the proofs maybe include some example graphs the authors adequately addressed the limitations docsepsumofsquares is a family of algorithms that capture many known techniques for combinatorial optimization problems and it has been recently shown to be very powerful for many ml and robust statistics problems for the theoretical computer science community some have seen this as a proxy for strong algorithmic impossibility results especially in the averagecase setting this paper rigorously establishes strong sumofsquares lower bounds at degree $n^{\epsilon}$ for several problems in the colloquial densegraph regime including tensor pca sparse pca both of which have been claimed by a previous work of hopkins et al hkprss17 but without an explicit spelledout proof on the technical level it builds upon the sos lower bound for planted clique by barak et al bhkkmp16 in that they use the recursive factorization framework to show psdness of the candidate moment matrix obtained from pseudocalibration with intense matrix analysis for the specific hardness this work obtains not much is known before except the degree 2 and degree 4 lower bounds in the sos realm and this work obtains almost tight hardness matching known algorithms  strength 1 in general this is a detailed and carefully written paper 2 though the hardness result may not be too surprising the almost tight hardness result of this work is certainly nice to have and necessary for the community considering how universal these two problems are in ml practice 3 their generalization of the framework from planted clique is likely to find applications in other problems  weakness 1 as the paper is filled with details itd be nice to incorporate more exposition in the technical section it is hard to imagine someone going through this generalization and as technical as it already is more intuition between the technical proofs can be helpful for readability 2 it is not immediately clear what technical challenges the authors claimed to have overcome from planted clique they note that the method from planted clique does not immediately work while the challenges are not explicitly stated  na docsepthis is a theoretical work that studies the applicability of sum of squares sos algorithms for sparse principal component analysis pca and tensor pca specifically this work provides theoretical proof that despite the success of sos algorithms in high dimensional statistics they underperform when applied to sparse pca and tensor pca 
the paper claims that sos algorithms cannot beat traditional sparse pca algorithms even if allowed subexponential time  strengths 1 the text is easy to follow 2 visual illustrations eg in figure 1 assist in understanding the claims of the paper 3 the paper is theoretical and appears to propose a broad result on the usefulness of sos algorithms for sparse pca and tensor pca  weaknesses 1 the flow of the paper can be further improved 2 is the proposed theory relevant in practice please elaborate for example theorem 3.1 assumes a range for m and theorem 3.2 assumes an upper limit for $\lambda$ are these valid assumptions in practice 3 figures are referred to as fig please correct them 4 equations are not referenced please number them for easy reference 5 line 124 consists of a tensor product notation and it is not explained please provide an explanation  the paper identifies relevant future directions of research based on limitations of the current work ### Summary:
the reviewers appreciate the solid theoretical results on the hardness of sparse and tensor pca the nearly sharp characterization of the limitations of sumofsquares methods makes the paper stand out the proof techniques may potentially be useful for other problems based on the above i recommend acceptance meanwhile please carefully revise the paper to improve the presentation as the paper is technically involved it would be nice to better elaborate the proof ideas and highlight the key challenges
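The planted models these reviews refer to are easy to simulate, which may help ground the discussion of the easy and hard regimes. The sketch below, assuming only numpy, draws samples from the spiked Wishart sparse PCA model (samples from N(0, Id + lambda*uu^T) with a k-sparse unit vector u) and computes the simple diagonal-thresholding statistic that works in the easy regime; it also builds an order-3 tensor PCA instance A = lambda*u^{⊗3} + noise and applies the naive unfolding heuristic. The sum-of-squares relaxations that the paper actually analyzes are not implemented here, and all parameter values are illustrative assumptions only.

```python
# Minimal sketch of the spiked Wishart sparse-PCA model and order-3 tensor PCA
# discussed in the reviews.  Illustrative assumptions only; the SoS relaxations
# analyzed in the paper are NOT implemented here.
import numpy as np

rng = np.random.default_rng(1)


def sparse_spike(d, k):
    """A k-sparse unit vector with +-1/sqrt(k) entries on a random support."""
    u = np.zeros(d)
    support = rng.choice(d, size=k, replace=False)
    u[support] = rng.choice([-1.0, 1.0], size=k) / np.sqrt(k)
    return u


def wishart_samples(m, d, lam, u=None):
    """m samples from N(0, I_d) if u is None, else from N(0, I_d + lam * u u^T)."""
    z = rng.normal(size=(m, d))
    if u is None:
        return z
    g = rng.normal(size=m)  # shared Gaussian coefficient along the spike
    return z + np.sqrt(lam) * np.outer(g, u)


def diag_threshold_stat(V, k):
    """Top eigenvalue of the empirical covariance restricted to the k highest-variance coordinates."""
    m = V.shape[0]
    cov = V.T @ V / m
    top = np.argsort(np.diag(cov))[-k:]
    return float(np.linalg.eigvalsh(cov[np.ix_(top, top)])[-1])


# Sparse PCA: here m * lam^2 >> k^2, i.e. the "easy" regime where simple algorithms succeed.
d, k, m, lam = 400, 20, 1000, 3.0
u = sparse_spike(d, k)
print("null statistic    ~", round(diag_threshold_stat(wishart_samples(m, d, lam), k), 2))
print("planted statistic ~", round(diag_threshold_stat(wishart_samples(m, d, lam, u), k), 2))

# Tensor PCA, order 3: A = lam_t * u (x) u (x) u + Gaussian noise, recovered by the naive
# unfolding heuristic (top left singular vector of the n x n^2 unfolding of A).
n, lam_t = 50, 60.0  # lam_t chosen well above the ~n^{3/4} unfolding threshold for this small n
u3 = rng.normal(size=n)
u3 /= np.linalg.norm(u3)
A = lam_t * np.einsum("i,j,k->ijk", u3, u3, u3) + rng.normal(size=(n, n, n))
top_vec = np.linalg.svd(A.reshape(n, n * n), full_matrices=False)[0][:, 0]
print("tensor correlation |<u, u_hat>| ~", round(abs(float(top_vec @ u3)), 2))
```

In the planted case the diagonal-thresholding statistic is visibly larger than under the null, which is exactly the kind of simple spectral separation that the reviewed lower bounds say cannot be improved upon by sum-of-squares in the hard regime.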
310, 253, 4081, 3762, 4623, 275, 3946, 4496, 21184, 323, 1650, 10012, 4562, 19584, 247, 2491, 323, 278, 285, 10012, 4567, 19584, 271, 5170, 2701, 323, 29331, 403, 841, 3588, 13260, 275, 3946, 495, 8442, 403, 6289, 281, 347, 3036, 4496, 3451, 731, 577, 7424, 403, 417, 23378, 4496, 1180, 731, 323, 3477, 3806, 50276, 22, 1386, 17294, 8414, 273, 247, 13148, 1885, 14951, 285, 352, 310, 417, 5544, 4496, 2085, 271, 8813, 50276, 783, 2929, 22649, 4623, 2852, 10746, 273, 2561, 1754, 327, 7364, 273, 253, 1655, 789, 50276, 187, 187, 4118, 18435, 27, 783, 30628, 11435, 253, 4891, 10527, 1543, 327, 253, 38576, 273, 23507, 285, 13148, 268, 6357, 253, 4829, 9479, 14846, 273, 7364, 273, 2020, 1171, 15044, 3082, 2789, 253, 2929, 1462, 562, 253, 4737, 5609, 778, 7826, 320, 4217, 323, 643, 3237, 1754, 327, 253, 1840, 891, 5583, 14924, 26614, 4496, 9257, 49620, 253, 2929, 281, 3157, 9759, 347, 253, 2929, 310, 22335, 3206, 352, 651, 320, 5322, 281, 1805, 21184, 253, 4737, 5697, 285, 6780, 253, 2234, 7881 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper is well written and interesting to read the experimental setup is made very clear and results are nicely portrayed however the motivation for wbmri synthesis does not come across completely from the very beginning i really appreciate the combination of pgan and stylegan and schlegls anomaly detection approach i also really like the study of simulated tumor intensity and radius and their impact on anomaly detection although these preliminary results are not earth-shattering pros well written and clearly motivated cons given the rare nature of cancer in pediatrics can you comment on the clinical relevance of this field of study do you see potential elsewhere cancer regions are only simulated a set of real cancer testing data would have been nice is such data available again i think this relates to the clinical relevance anomaly detection which model did you exactly use for anomaly detection dcgan i really wonder how you were able to obtain such compelling results using dcgan it would be great if you could provide details on the training anomaly detection do you also have visual results of a less hyperintense simulated tumor anomaly detection is the accuracy evaluated on a pixel level or on the level of connected components where exactly did the stylegan2 fail as the radiologist was still able to correctly identify 70 of the generated images as fake minor introduction how the synthesis of such wbmri would play together with anomaly detection is not completely obvious from the very beginning id suggest some rephrasing for the introductiondocsepthe paper evaluates pediatric whole body mri generation using 4 preexisting gan models the evaluation consists of qualitative visualization as well as fid dfd and radiologist discriminative rate for real vs fake images they also conduct a synthetic anomaly detection task ie imputing the healthy mri with an artificial anomaly and show the model being able to identify the inserted artifact i have some major concerns which i will list 1 the research question behind this work is not clear to me in the introduction the authors state the motivation for this work is to develop a cancer screening tool however the experiments do not reflect that if the research question is how gans can be used for cancer screening in wbmri then they should have evaluated the model on a real anomaly detection task not a synthetic one 2 in the synthetic anomaly detection task it is not clear if the query image is from the training data or from heldout test data based on what i infer from the paper it seems the authors used the whole 90 subjects for training which implies the query image was from the training data this impairs the validity of the experiment since the generator could have overfitted to the training data i would like to see a heldout dataset consisting of multiple subjects 3 there is no qualitative or quantitative measure to show if the generative model has overfitted to the training data one way to show this is to display for every generated image the closest neighbor in the training data while this is not a quantitative measure it is a qualitative one also what metric to use to find the nearest neighbor could be tricky you could use the same metric you used in the anomaly detection task i think the paper needs more experiments to validate the approach and so i would reject it in the current state docsepthis paper compares several gan methods in terms of generating paediatric wbmri images they compared different gans with two metrics commonly used in computer vision and a real vs fake human test they also used the generated images for a cancer detection task with a comparison to a classical method pros compare several gan methods including some very recent ones use metrics to evaluate results including a human test showed convincing qualitative results cons fid and dfd may not be suitable to evaluate medical images one key point of medical image synthesis is that synthesised images do not only need to be varied and realistic but need to be clinically meaningful not clear what is the input of the gans is it a random vector or scalar or is it a real medical image if you want to perform detection by comparing generated images with real images is it better to use real medical images as input instead of finding the closest neighbour what are the data you used are they publicly available is the classical method you compared with the stateoftheart or how far is it from the stateoftheart it seems that its results are quite poor ### Summary:
this paper compares several gan methods for generating pediatric wbmri images the paper is well written and results although limited are clear and interesting the reviewers see merits in such a short paper and mostly think that it is worth being presented at midl
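As an aside on the memorization check requested in the second review above (showing, for every generated image, its closest neighbor in the training set), a minimal sketch is given below. It is not taken from the reviewed paper: the array shapes are made up, the default distance is plain pixel-space L2, and the feature_fn hook is only a placeholder for whatever feature extractor (for example the one used in the anomaly-detection comparison) one might prefer.

```python
import numpy as np

def nearest_training_neighbors(generated, training, feature_fn=None):
    """For each generated image, return the index of and distance to its
    closest training image. Distances are plain L2 in flattened pixel space
    unless a feature_fn (e.g. an anomaly-detection feature extractor) is given."""
    if feature_fn is None:
        feature_fn = lambda x: x.reshape(len(x), -1)
    g = feature_fn(np.asarray(generated, dtype=np.float64))
    t = feature_fn(np.asarray(training, dtype=np.float64))
    # pairwise squared distances via ||a-b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    d2 = (g ** 2).sum(1)[:, None] + (t ** 2).sum(1)[None, :] - 2.0 * (g @ t.T)
    idx = d2.argmin(axis=1)
    return idx, np.sqrt(np.maximum(d2[np.arange(len(g)), idx], 0.0))

# toy demo with random arrays standing in for generated and training scans
gen = np.random.rand(4, 64, 64)
train = np.random.rand(100, 64, 64)
idx, dist = nearest_training_neighbors(gen, train)
print(idx, dist)
```

In practice one would inspect the returned neighbors side by side with the generated images; consistently tiny distances to individual training scans would be the qualitative sign of overfitting the reviewer asks about.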
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper first introduces and describes multilexsum a benchmark for multidocument summarization with three levels of granularity l s t multilexsum consists of the documents comprising the docket for cases and the task is to summarize the facts and procedural history of the case the paper then evaluates a range of models on multilexsum they explore summary to summary summarization ie going from a long summary to a short summary and multitask summarization finally they measure humanassessed quality of the summaries using domain experts and discuss how such tools can be incorporated into real world pipelines significance of contributions i think the paper has two strengths first from the perspective of the legal nlp community the dataset that the authors introduce is incredibly valuable for several reasons 1 legal nlp benchmarks are extremely difficult to build given how expensive legal annotations are this is a very high quality dataset on us case law also rare an open question in legal nlp is whether pretraining on legal texts actually helps performance a large difficulty in answering this question is the lack of datasets corresponding to hard tasks this seems like a task and dataset which could really push our understanding forward 2 summarizing legal texts is a nontrivial task which is a core task that many lawyers do law students are required to summarize cases every day in class judges summarize cases in legal opinions lawyers summarize cases when constructing arguments in court and in legal briefs thus this task could enable tools which have a meaningful impact from the perspective of the machine learning and nlp community the task that the authors identify multigranularity summarization seems understudied based on my skim online thus this dataset would be of value to the broader nlp community and not just those focused on legal applications rigor the experiments are well done and reasonable in light of the claims made by the paper i dont really see any major weaknesses docsepthe authors present a new dataset of 9280 expertauthored summaries from largescale civil rights lawsuits that include three types of summary tiny short and long they experiment with stateoftheart models for abstractive summarization bart pegasus led and primera on their new dataset and show that these models perform poorly on lengthy summaries since labels are difficult to acquire in law datasets with expert annotations such as this one are immensely valuable the task of generating a summary from multiple lengthy documents is mainly untouched but extremely important as the authors point out such automatic summarization would help legal experts in drafting lawsuit descriptions more efficiently which would benefit citizens in accessing and understanding more legal disputes and resolutions the authors perform an extensive analysis of existing summarizers on their dataset the paper is a joy to read very clear and well structured the dataset documentation is thorough the extent of novelty technical or scientific depth is limited yet appropriate for a resource paper docsepmultilexsum is a lawspecific domain multidocument text summarization dataset the summaries are claimed to be abstractive the summaries in this dataset vary from long and short to tiny and are derived from multiple documents the paper conducts a series of experiments to show how challenging the dataset is for the current stateoftheart text summarization models the contribution of the paper is as follows 1 the summaries have multiple granularities which means that there are long short and tiny summaries 2 the summaries are written by law experts 3 the dataset includes a fair number of multiple documents and their summaries 1 the dataset creates new challenges for the current text summarization models 2 the paper conducted a good series of experiments to support their claim 3 the dataset can enhance the summarization models for the law and civil rights domain although the paper has conducted a series of good experiments it misses many crucial points that weaken the papers claim in the paper there are many ambiguities and unclear statements that hinder the presentation of the data construction experiment settings and data evaluation for data collection and construction on page two the authors mention that there are about 40000 source documents and 9000 expertwritten summaries then on page 4 the number of samples is less than 4600 there is a great confusion here besides it is not clear whether each document has multiple summaries or whether multiple documents have one or more summaries for example in table 2 the number of samples of documentsmultidocuments is 1603 for tiny summaries 3138 for short summaries and 4534 for long summaries and the average number of documents for those summaries varies between 8 and 10 documents it is better to show how many documents only have one tiny short or long summary or whether one document can have many summaries is each summary derived from more than one document how many summaries are derived from only one single document in addition there is a misleading claim regarding the size of the dataset the paper compares the lawspecific domain summaries which is acceptable to newsroom and other large text summarization datasets newsroom cnndaily mail and xsum are far larger in size how can you claim that your data is large compared to those datasets there are other ambiguities in the data evaluation regarding the characteristics of the summaries the paper claims that their summaries are abstractive because they are written by authors which is undeniable however when looking at the coverage scores from their analysis of the characteristics of their data the numbers are concerning the coverage score quantifies how much the summary borrows words from the original document multilexsum average coverage scores range between 0.92 and 0.96 which indicates a very high number of words borrowed from the document it is true the sentences are not extracted directly from the document but there are no new words that could represent the summary almost all the words are present in the original document which makes them extractive but not necessarily organized in the same order the density measure could support the authors claim because the numbers are not very high it is better to include figures to show the scores of coverage and density the other concerning point is the human evaluation of the dataset the first attempt described on page 8 did not succeed so the authors designed an alternative system to score their human evaluation but the system is not clearly described and the results are only briefly described which casts some doubt and lack of clarity on the human evaluation method what is the specific objective of the human evaluation is there a small example of the evaluation in the paper the detailed examples can be added in the appendix but the main explanation and short examples should be present in the paper the intended measurements of the human evaluation are missing for the experiment setting the paper tries to generate more succinct versions tiny or short summaries of the long summaries and the documents what is the objective of this experiment multilexsum already has multiple granularities the experiment of generating summaries from summaries casts doubts on the evaluation are the summaries generated only from the original document from both the original document and the long summary or only from the long summary each scenario has its own followup questions for evaluation docsepthis paper introduces multilexsum a multidoc summarization dataset based on civil rights lawsuits from us federal courts compared to existing benchmarks multilexsum provides three granularities tiny short and long of summaries for source documents it enables the study of multitarget summaries at these three granularities the paper also conducted a series of experiments on multilexsum the results indicate that existing popular summarization models cannot achieve satisfying performance 1 from the perspective of document summarization multilexsum is the first to provide summaries at multiple levels of granularity 2 from the perspective of legal ai the proposed multilexsum consists of legal expertauthored summaries and would encourage legal ai studies especially applications such as legal case summarization etc 3 the experiments were conducted by considering both automatic evaluations and human evaluations 1 some details on data preprocessing especially the data selection are not introduced clearly the dataset is drawn from the ongoing crlc writing what features are used for selecting and filtering the data records from the whole posts are they selected based on the charge year or other features sometimes inappropriate data selection criteria may result in a dataset with potential social biases 2 multilexsum is based on largescale civil rights lawsuits it depends on the lawyers and law students to produce highquality summaries of key events and outcomes is there any critical difference between generalpurpose long document summarization and the summarization in multilexsum are there any legaldomain specific designs in multilexsum due to the special properties of lawsuits 3 the details on the annotation are not provided for example how many annotators are assigned to one document how about the interannotator agreement and how is the agreement measured docsepthanks for the opportunity to review this paper i recommend accepting the authors introduce a new set of summaries of long legal documents concerning civil rights cases to my knowledge a set of expertwritten summaries of this magnitude is unique for a legal application especially because many of the previous projects cited rely on data that is subject to extremely restrictive licensing regimes canlii and auslii and this data is focused on a subject area with high potential for social impact the existence of multiple levels of granularity is a very nice feature this is not only useful for legal applications many people are increasingly aware of the need to handle long sequences and this will be a nice task to push models in that direction the authors point about the length of output sequences from sota models speaks to this point this data will be really useful in future efforts to solve that problem also i was especially impressed by the hci contribution which is all too rare in studies like this one the paper is clearly written some issues to be aware of my number one question is about data quality in the example notebook the snippet of the source document suggests serious parsing errors disctcqur for the middle districtlo alaama these are not atypical for ocr projects but i wonder what the prevalence of errors like these is please help clarify this in your responses pervasive parsing issues would make learning the summaries difficult and reduce the value of the data as for the benchmark results i wasnt sure why the authors chose to finetune for only 6 epochs or the choices that were made to handle other specific elements eg unique legal vocab citations etc that might have impeded performance more detail on the hci summarization project would have been nice to include in the main paper as thats a unique and important element overall i think these problems can probably be solved the dataset is to my knowledge a unique collection of expertwritten summaries of very long documents the summaries are presented at varying levels of granularity the data addresses an important social problem the experiments demonstrate the value of the data the authors demonstrate a realworld application with feedback from human subjects i am uncertain about the data quality of the source documents i was not confident that the experiments displayed the best possible versions of the models ### Summary:
this is a very highquality dataset which is especially noteworthy since legal nlp benchmarks are very difficult to build the authors approached every aspect of the process with extreme care it will likely have an impact on law practice and start interesting discussions about the use of ml in these settings the paper is also very well written and enjoyable to read most reviewers are heavily in favor of acceptance reviewer aeqk brought up some weaknesses but at least from the acs perspective these seem to be answered well by the authors
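For reference on the coverage and density numbers discussed in the third review above, the sketch below shows one common way such extractive-fragment statistics are computed: greedy longest verbatim matches of summary tokens in the source, with coverage the matched fraction of summary tokens and density its length-weighted counterpart. This is an assumption made here for illustration; the exact formulation used by the reviewed paper may differ.

```python
def extractive_fragments(article_tokens, summary_tokens):
    """Greedy longest-match fragments of the summary that appear verbatim in the article."""
    a, s = article_tokens, summary_tokens
    fragments, i = [], 0
    while i < len(s):
        best = 0
        for j in range(len(a)):
            k = 0
            while i + k < len(s) and j + k < len(a) and s[i + k] == a[j + k]:
                k += 1
            best = max(best, k)
        if best > 0:
            fragments.append(s[i:i + best])
            i += best
        else:
            i += 1
    return fragments

def coverage_and_density(article, summary):
    a, s = article.lower().split(), summary.lower().split()
    frags = extractive_fragments(a, s)
    coverage = sum(len(f) for f in frags) / max(len(s), 1)
    density = sum(len(f) ** 2 for f in frags) / max(len(s), 1)
    return coverage, density

print(coverage_and_density(
    "the court granted the motion for class certification in march",
    "the court granted class certification"))
```

In the toy example every summary token appears verbatim in the source, so coverage comes out at 1.0 while density stays modest; high coverage with modest density is the pattern the reviewer reads as borrowing individual words without copying whole sentences.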
[ 776, 4685, 3579, 374, 10405, 3006, 4320, 17438, 310, 247, 37825, 4836, 534, 310, 247, 5161, 4836, 326, 1142, 16099, 513, 1569, 3484, 403, 2424, 281, 26799, 2219, 1046, 1388, 275, 966, 16006, 26799, 2219, 275, 4320, 11626, 16099, 26799, 2219, 672, 26736, 7125, 275, 1302, 285, 275, 4320, 32852, 3021, 436, 4836, 812, 8046, 5657, 534, 452, 247, 14282, 3486, 50274, 4064, 253, 8668, 273, 253, 5145, 4715, 13307, 81, 3114, 253, 4836, 326, 253, 4477, 1548, 338, 1105, 503, 304, 4011, 792, 414, 10405, 1320, 339, 3030, 762, 14091, 728, 1754, 327, 619, 43816, 3909, 3021, 436, 15302, 651, 320, 273, 1318, 281, 253, 16055, 295, 24343, 3114, 285, 417, 816, 1110, 7106, 327, 4320, 4893, 50274, 10389, 263, 253, 4679, 403, 973, 2218, 285, 5272, 275, 1708, 273, 253, 3916, 1160, 407, 253, 2929, 50276, 74, 13414, 1663, 923, 667, 2201, 32213, 5474, 339, 431, 248, 4477, 1246, 247, 747, 10895, 273, 11266, 1438, 1172, 893, 3097, 2149, 14568, 3927, 432, 1236, 2510, 25912, 5079, 3570, 33563, 326, 2486, 1264, 3510, 273, 6010, 10058, 2159, 285, 1048, 575, 9328, 3368, 342, 1375, 23037, 14387, 3210, 323, 12002, 422, 10405, 1320, 44693, 759, 22228, 316, 3977, 285, 2248, 3525, 327, 616, 747, 10895, 285, 921, 326, 841, 3210, 1347, 15225, 327, 24585, 14568, 3927, 50276, 17480, 13301, 403, 2834, 281, 16270, 275, 1569, 15302, 342, 6485, 31825, 824, 347, 436, 581, 403, 45224, 9865, 50276, 783, 4836, 273, 11365, 247, 6010, 432, 2709, 24585, 7177, 310, 7194, 48976, 533, 6685, 1774, 347, 253, 4477, 1127, 562, 824, 271, 12077, 10405, 1320, 651, 1361, 4320, 10071, 275, 34318, 15091, 20121, 625, 14556, 534, 651, 5649, 7815, 275, 24497, 285, 4685, 625, 4320, 24598, 285, 30285, 50276, 783, 4477, 1347, 271, 9470, 1783, 273, 5368, 10405, 14460, 327, 616, 10895, 50276, 783, 2929, 310, 247, 11010, 281, 1239, 1077, 2590, 285, 973, 18872, 50276, 783, 10895, 10097, 310, 11080, 253, 6070, 273, 38135, 7681, 390, 8249, 6864, 310, 3710, 2568, 4569, 323, 247, 7741, 2929, 5474, 339, 2617, 503, 587, 89, 2204, 310, 247, 1569, 6160, 5028, 23964, 1829, 2505, 10405, 1320, 10895, 253, 14568, 3927, 403, 7558, 281, 320, 12002, 422, 253, 14568, 3927, 275, 436, 10895, 6889, 432, 1048, 2159, 285, 10058, 285, 6012, 432, 2709, 7177, 253, 2929, 2589, 84, 2962, 273, 4679, 281, 921, 849, 11132, 253, 10895, 310, 327, 253, 1655, 1375, 23037, 14387, 2505, 10405, 1320, 3210, 50276, 783, 7680, 273, 253, 2929, 310, 347, 3637, 50276, 18, 50276, 783, 14568, 3927, 452, 2709, 28859, 792, 1005, 534, 2097, 326, 627, 403, 1048, 2159, 285, 10058, 14568, 3927, 50276, 19, 50276, 783, 14568, 3927, 403, 3542, 407, 1569, 10071, 50276, 20, 50276, 783, 10895, 3797, 247, 4344, 1180, 273, 2709, 7177, 285, 616, 14568, 3927, 50274, 18, 50276, 783, 10895, 10513, 747, 7881, 323, 253, 1655, 2505, 10405, 1320, 3210, 50276, 19, 50276, 783, 2929, 5196, 247, 1175, 2962, 273, 4679, 281, 1329, 616, 1750, 50276, 20, 50276, 783, 10895, 476, 7278, 253, 10405, 1320, 3210, 323, 253, 1569, 285, 5079, 3570, 5028, 50276, 20261, 253, 2929, 556, 5196, 247, 2962, 273, 1175, 4679, 533, 352, 38771, 1142, 9560, 2792, 326, 5075, 561, 253, 9380, 1750, 50276, 249, 253, 2929, 627, 403, 1142, 15200, 39560, 285, 12744, 7234, 326, 35007, 253, 6742, 273, 941, 5140, 3368, 7533, 285, 941, 7103, 50276, 1542, 941, 4849, 285, 5140, 275, 3239, 767, 253, 4477, 3748, 326, 627, 403, 670, 577, 1418, 2603, 7177, 285, 898, 933, 6485, 15720, 14568, 3927, 840, 275, 3239, 577, 253, 1180, 273, 253, 3530, 403, 1679, 685, 7904, 361, 627, 310, 247, 1270, 13775, 1060, 16280, 352, 310, 417, 2590, 326, 1880, 1016, 3389, 556, 2709, 14568, 
3927, 390, 2709, 7177, 326, 452, 581, 390, 625, 6010, 323, 1650, 275, 2829, 374, 253, 1180, 273, 3530, 323, 10058, 14568, 3927, 323, 7177, 9961, 301, 406, 3222, 310, 12036, 20, 495, 15148, 323, 2159, 14568, 3927, 285, 5329, 1706, 50276, 1542, 1048, 14568, 3927, 253, 3388, 7177, 323, 1110, 14568, 3927, 6889, 875, 854, 740, 7177, 352, 310, 1805, 281, 921, 326, 849, 1142, 7177, 326, 760, 452, 581, 10058, 2159, 390, 1048, 6010, 390, 581, 3389, 476, 452, 1142, 14568, 3927, 50276, 261, 1016, 6010, 6012, 432, 625, 685, 581, 3389, 849, 1142, 14568, 3927, 326, 403, 760, 6012, 432, 581, 2014, 3389, 275, 1635, 627, 310, 247, 24363, 1750, 5001, 253, 1979, 273, 253, 10895, 253, 2929, 26662, 253, 1569, 6160, 5028, 14568, 3927, 534, 310, 12207, 281, 3668, 4461, 285, 643, 1781, 2505, 10405, 1320, 15302, 3668, 4461, 260, 79, 2109, 4170, 8888, 285, 1269, 2204, 403, 2080, 4067, 275, 1979, 849, 476, 368, 1750, 326, 634, 941, 310, 1781, 2429, 281, 1110, 15302, 50276, 9088, 403, 643, 15200, 39560, 275, 253, 941, 7103, 5001, 253, 5319, 273, 253, 14568, 3927, 253, 2929, 3916, 326, 616, 14568, 3927, 403, 12002, 422, 984, 597, 403, 3542, 407, 4477, 534, 310, 43296, 6051, 2299, 672, 2819, 387, 253, 7031, 7363, 273, 616, 1783, 281, 253, 5319, 273, 616, 941, 253, 3904, 403, 8664, 253, 7031, 4868, 2677, 7790, 849, 1199, 253, 6010, 13179, 84, 3000, 432, 253, 3236, 3389, 1554, 587, 89, 2020, 3388, 7363, 2491, 875, 15630, 938, 4196, 534, 6492, 247, 1077, 1029, 1180, 273, 3000, 29563, 432, 253, 3389, 352, 310, 2032, 253, 14683, 403, 417, 10375, 3587, 432, 253, 3389, 533, 627, 403, 417, 747, 3000, 326, 812, 1957, 253, 6010, 2761, 512, 253, 3000, 403, 1246, 275, 253, 3236, 3389, 534, 2789, 731, 4908, 422, 533, 417, 7933, 10932, 275, 253, 1072, 1340, 253, 4038, 2557, 812, 1329, 253, 4477, 1750, 984, 253, 3904, 403, 417, 1077, 1029, 352, 310, 1805, 281, 2486, 8442, 281, 921, 253, 7363, 273, 3835, 1131, 285, 4038, 50276, 783, 643, 8664, 1127, 310, 253, 1966, 7103, 273, 10895, 253, 806, 3177, 858, 417, 9302, 275, 3239, 854, 594, 253, 4477, 4158, 271, 5795, 985, 281, 4868, 616, 1966, 7103, 533, 253, 985, 310, 417, 4518, 2529, 285, 253, 1543, 403, 13366, 2529, 534, 43603, 690, 24626, 285, 440, 498, 15752, 327, 253, 1966, 7103, 1332, 752, 310, 253, 2173, 8103, 273, 253, 1966, 7103, 310, 627, 247, 1355, 1650, 273, 253, 7103, 275, 253, 2929, 253, 7000, 6667, 476, 320, 2879, 275, 253, 30762, 533, 253, 2022, 8813, 285, 2159, 6667, 943, 320, 1246, 275, 253, 2929, 253, 6034, 6341, 273, 1966, 7103, 403, 5816, 50275, 1542, 253, 3368, 4758, 253, 2929, 14177, 281, 6635, 625, 18382, 4291, 2715, 10058, 390, 2159, 14568, 3927, 273, 253, 1048, 14568, 3927, 285, 253, 7177, 50276, 5371, 310, 253, 8103, 273, 436, 3368, 1554, 587, 89, 2204, 556, 2168, 2709, 32449, 1005, 253, 3368, 273, 11365, 14568, 3927, 432, 14568, 3927, 43603, 24626, 327, 253, 7103, 403, 253, 14568, 3927, 4561, 760, 432, 253, 3236, 3389, 432, 253, 3236, 3389, 50276, 5056, 6010, 390, 760, 432, 253, 1048, 6010, 1016, 10076, 556, 697, 1211, 956, 484, 3533, 323, 7103, 50275, 7152, 33032, 2520, 2929, 23970, 1554, 587, 89, 2204, 247, 23964, 406, 10405, 1320, 10895, 1754, 327, 5079, 3570, 33563, 432, 441, 4400, 7829, 50276, 3118, 1096, 281, 5368, 49602, 1554, 587, 89, 2204, 3400, 1264, 32449, 1005, 10058, 2159, 285, 1048, 273, 14568, 3927, 323, 2603, 7177, 352, 13276, 253, 2175, 273, 1554, 262, 1816, 14568, 3927, 387, 841, 1264, 32449, 1005, 253, 2929, 671, 5196, 247, 2962, 273, 4679, 327, 1554, 587, 89, 2204, 253, 1543, 5224, 326, 44528, 4633, 10405, 1320, 3210, 2550, 5115, 14127, 
16226, 50276, 18, 186, 4064, 253, 8668, 273, 3389, 10405, 1320, 1554, 587, 89, 2204, 41005, 3400, 14568, 3927, 387, 2709, 2308, 273, 32449, 414, 374, 186, 4064, 253, 8668, 273, 4320, 23105, 253, 4081, 1554, 587, 89, 2204, 8414, 273, 4320, 1172, 893, 3097, 2149, 14568, 3927, 285, 651, 11907, 4320, 23105, 2175, 3340, 253, 4893, 824, 347, 4320, 1083, 10405, 1320, 3966, 495, 186, 783, 4679, 497, 5196, 407, 7296, 1097, 253, 12077, 27163, 285, 253, 1966, 27163, 50276, 18, 186, 8826, 4278, 327, 941, 638, 21678, 3340, 253, 941, 5438, 403, 417, 5611, 4518, 253, 10895, 310, 8392, 432, 253, 10800, 1531, 27827, 4028, 752, 3386, 403, 908, 323, 17221, 10978, 272, 253, 941, 5861, 432, 253, 2644, 9319, 403, 597, 4236, 1754, 327, 253, 4179, 807, 390, 643, 3386, 4536, 253, 19582, 941, 5438, 6866, 778, 906, 275, 247, 10895, 342, 2442, 2675, 31306, 50275, 19, 186, 9961, 587, 89, 2204, 310, 1754, 327, 1236, 2510, 25912, 5079, 3570, 33563, 352, 7024, 327, 253, 16099, 285, 1569, 3484, 281, 4711, 1029, 15177, 14568, 3927, 273, 2234, 3394, 285, 6973, 310, 627, 667, 4619, 3064, 875, 253, 2087, 27299, 1048, 3389, 10405, 1320, 285, 253, 10405, 1320, 275, 1554, 587, 89, 2204, 403, 627, 667, 4320, 13517, 2173, 11809, 275, 1554, 587, 89, 2204, 1955, 281, 253, 2714, 3607, 273, 33563, 50275, 20, 186, 783, 4278, 327, 253, 22581, 403, 417, 2530, 323, 1650, 849, 1142, 12182, 2392, 403, 5262, 327, 581, 3389, 849, 670, 253, 734, 11423, 1080, 4345, 285, 849, 281, 2557, 253, 4345, 50275, 7152, 33032, 35501, 323, 253, 5107, 281, 2278, 436, 2929, 891, 5583, 18738, 50275, 783, 4477, 9569, 247, 747, 873, 273, 14568, 3927, 273, 1048, 4320, 7177, 8664, 5079, 3570, 2219, 281, 619, 3640, 247, 873, 273, 6485, 15720, 14568, 3927, 273, 436, 9777, 310, 4451, 323, 247, 4320, 2898, 3340, 984, 1142, 273, 253, 2045, 6493, 11106, 10725, 327, 941, 326, 310, 2256, 281, 6685, 29190, 26920, 27005, 476, 965, 74, 285, 16506, 965, 74, 285, 436, 941, 310, 7106, 327, 247, 2256, 2170, 342, 1029, 2442, 323, 2675, 3486, 253, 6242, 273, 2709, 2308, 273, 32449, 414, 310, 247, 1077, 5322, 4735, 436, 310, 417, 760, 4217, 323, 4320, 4893, 1142, 952, 403, 9592, 6600, 273, 253, 878, 281, 6016, 1048, 6430, 285, 436, 588, 320, 247, 5322, 4836, 281, 7450, 3210, 275, 326, 3884, 253, 4477, 1127, 327, 253, 2978, 273, 3453, 6430, 432, 256, 5503, 3210, 16544, 281, 436, 1127, 436, 941, 588, 320, 1663, 4217, 275, 2852, 6031, 281, 8415, 326, 1895, 671, 891, 369, 3340, 17847, 407, 253, 288, 5297, 7680, 534, 310, 512, 1512, 7520, 275, 2175, 751, 436, 581, 253, 2929, 310, 4518, 3542, 50276, 8826, 3374, 281, 320, 6600, 273, 619, 337, 1953, 310, 670, 941, 3290, 275, 253, 1650, 24849, 253, 36408, 273, 253, 2603, 3389, 5936, 4092, 29072, 6332, 557, 291, 68, 82, 321, 323, 253, 4766, 3286, 4213, 355, 66, 2902, 841, 403, 417, 34162, 323, 258, 7083, 6493, 533, 891, 4282, 752, 253, 8996, 273, 6332, 751, 841, 310, 4496, 1361, 19148, 436, 275, 634, 6128, 42551, 29072, 3374, 651, 1056, 4715, 253, 14568, 3927, 2834, 285, 4796, 253, 1318, 273, 253, 941, 347, 323, 253, 22791, 1543, 891, 369, 2649, 2119, 2139, 253, 4477, 9703, 281, 1442, 292, 2517, 323, 760, 721, 44540, 390, 253, 10165, 326, 497, 1160, 281, 6016, 643, 2173, 3603, 24088, 4451, 4320, 11571, 357, 30404, 3966, 326, 1537, 452, 1607, 16533, 3045, 625, 2508, 327, 253, 288, 5297, 10405, 1320, 2199, 651, 452, 644, 5322, 281, 2486, 275, 253, 2022, 2929, 347, 28763, 247, 4451, 285, 1774, 3284, 4583, 891, 1158, 841, 3237, 476, 3164, 320, 14042, 50275, 783, 10895, 310, 281, 619, 3640, 247, 4451, 4849, 273, 6485, 15720, 14568, 3927, 273, 
1077, 1048, 7177, 253, 14568, 3927, 403, 3559, 387, 11962, 2308, 273, 32449, 414, 253, 941, 12453, 271, 1774, 2675, 1895, 253, 4679, 7568, 253, 1318, 273, 253, 941, 253, 4477, 7568, 247, 1524, 10186, 2898, 342, 8680, 432, 1966, 5705, 50276, 74, 717, 8767, 670, 253, 941, 3290, 273, 253, 2603, 7177, 891, 369, 417, 13224, 326, 253, 4679, 8653, 253, 1682, 1896, 9508, 273, 253, 3210, 50276, 187, 187, 4118, 18435, 27, 2520, 310, 247, 1077, 1029, 15177, 10895, 534, 310, 3340, 35092, 1580, 4320, 295, 24343, 49602, 403, 1077, 2834, 281, 1973, 253, 4477, 13781, 1046, 4809, 273, 253, 1232, 342, 9559, 1557, 352, 588, 2779, 452, 271, 3486, 327, 1569, 3946, 285, 1265, 4722, 11985, 670, 253, 897, 273, 13361, 275, 841, 7533, 253, 2929, 310, 671, 1077, 973, 3542, 285, 30357, 281, 1239, 50276, 2252, 30628, 403, 11306, 275, 3718, 273, 14924, 37317, 247, 2574, 76, 3982, 598, 690, 32213, 533, 387, 1878, 432, 253, 913, 84, 8668, 841, 1646, 281, 320, 9577, 973, 407, 253, 4477 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
labels: [token-id sequence for the preceding example]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

This review is a collaboration between a senior and a junior reviewer; the junior wrote most of the review and the senior modified it. Both reviewers read the paper in detail.

Summary: this paper presents an alternative pretraining technique for language models in order to make them more knowledge-aware. In addition to pretraining a model with the traditional language modeling objective, this work proposes an entity prediction task as pretraining. Furthermore, this work also incorporates entity tokens at the input level of the model by summing the word embedding and its corresponding entity embedding. Experimental results show that such a pretraining method yields models with greater factual knowledge according to the LAMA knowledge probing task, compared to vanilla GPT-2 models. In addition, it is also reported that such models perform better than a GPT-2 model of the same size on TriviaQA, Natural Questions, and WebQuestions in a zero-shot setting, and achieve competitive results when compared to larger models.

This paper is well written and easy to follow. This work is timely because it shows that increasing the model size and pretraining data size is not the only way to achieve strong performance on language-related tasks, which is the current trend; inductive biases like knowledge about entities can improve the performance of models to the point of achieving competitive results with bigger models. Some limitations of this work include the lack of experiments on more diverse architectures, like encoder-decoders, to make sure that the proposed technique not only works on autoregressive GPT-type models; moreover, only one approach for entity integration is explored. Other than that, I have the following questions:
- Interesting choice of the margin loss for entity prediction. More motivation should be added as to why this loss instead of the cross-entropy loss. Optionally, an experiment comparing the performance of the model when trained with the cross-entropy vs. the margin loss would be interesting to have.
- At inference time you are forced to put entity labels on the input text as a preprocessing step. How much does this impact the generalization capacity of the model? How do you manage cases when an unseen entity is seen as input?
- Are the experiments on QA really zero-shot? The examples presented in the appendix are not dummy, as they reflect true questions and answers.

Minor:
- Have you tried your pretraining technique on an encoder-decoder architecture like T5? It would be interesting to see that this technique not only works with autoregressive models.
- Have you tried to finetune your resulting pretrained models on various tasks? It would be interesting to see if your pretraining method also helps finetuning more efficiently on some tasks. In addition, I would be curious to see if, while finetuning, the entity prediction accuracy and the LAMA probing performance of the model drop or remain more or less the same.

[1] Sun, Yu, et al. ERNIE: Enhanced Representation through Knowledge Integration. arXiv preprint arXiv:1904.09223, 2019.
[2] Ji, Yangfeng, et al. Dynamic Entity Representations in Neural Language Models. Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, 2017.

After author response: thanks for the response. After reading the other reviews, I feel the novelty of this work is somewhat limited, and it would be useful to highlight the differences with respect to previous work, along with a discussion of when your method is preferred over existing proposals. I am also decreasing my confidence score after reading the other reviews.

This paper proposes KALM, a knowledge-aware language model that incorporates entity information. Specifically, the method involves using a dictionary lookup to match n-grams to entities from a database; then each token is embedded into two vectors, one using the surface form and one using the entity id it is matched to, if there is a match. These embeddings are then added and passed into a transformer model to predict the next word given the context. Experiments compare the model to GPT-2 and T5, two state-of-the-art transformer-based LMs, and demonstrate the benefit of adding entity information.

Strengths: the idea of using entity information is certainly interesting and new, and most details of the paper are clear from the writing.

Weaknesses: the paper leaves open several questions that should really be explored and discussed. Given that the modeling contribution is simply adding an extra embedding layer, it would be good to have detailed analyses that provide more insights into the contribution itself (see points 5, 6, 7 below).

Detailed comments/questions:
1. The details of the dictionary lookup are not entirely clear. Do you lowercase all tokens? Do you handle small misspellings in the tokens? Do you perform fuzzy matches?
2. Thanks for providing several implementation details; however, the values of the hyperparameter alpha are not discussed. Does this require a lot of tweaking, or is the method stable to this choice?
3. (Minor) The zero-shot experiments really seem to be few-shot tests, since you provide the model with 8 or so examples.
4. The metrics used for the LAMA knowledge probing should be explained in the paper at least once; expand the abbreviations so readers can understand.
5. Table 3: the parameter sizes indicated are misleading. Large KALM has close to 700M params and not 300M as indicated; similarly, base KALM has 550M params, not 100M.
6. Related to the above point, it seems like KALM base has more parameters than GPT-2 base and even GPT-2 large, and yet this aspect is not accounted for or investigated in the paper. In fact, the paper says KALM "more efficiently packs practical knowledge into its parameters" and that 12L KALM is very close to 24L GPT-2 large. This seems to be very misleading information: since transformers are essentially a form of graph networks, I'm not sure entity embeddings can be treated separately from the model parameters, since one can view those as edges, part of the graph. Concrete suggestion: can you compare KALM vs. GPT-2 in all tables and Fig. 1 while controlling for the params? To me, the only closest comparison seems to be KALM base vs. GPT-2 large, where GPT-2 large seems to be the better model.
7. I appreciate the authors' efforts to test on a variety of benchmarks, and I understand that training these huge models is not easy, but to me the essence of the paper, i.e. the addition of entity information, seems underexplored. For instance, could you perform ablation studies that only use the entity embeddings in the input, to see how much knowledge they provide? Another ablation can be on the matching done in the dictionary lookup, in terms of frequency cutoffs, n = 1, 2, 3, 4 in the n-gram matching, etc. One could also analyze the prediction accuracy of the model to see which types of entities (e.g. persons, organizations, etc.) work better vs. those that don't help as much.

Update: thanks to the authors for their response, which helped clarify several of my minor questions, and I believe those can be revised with a writing pass. However, I still think this paper has two significant deficiencies.
1. The parameter size comparison still seems flawed to me. The authors say that one can discount the entity embeddings, but can we really? Aren't they part of the model's representation, even if the inference does not use all of them in a single forward pass? Several neural net architectures exist, including large-scale LMs, that do not use all inference paths but still count them in the total params. The BERT vs. GPT-2 example provided in the rebuttal is only a difference of 26k tokens, still significant, but here we're talking about a few hundred million parameters. At the very least, I think the true sizes of the models must be acknowledged, and one can add the point on entity embeddings vs. brain parameters as a caveat, but it seems scientifically inaccurate to me to claim otherwise. To be clear, I don't think the size issue detracts much from the main contribution of adding entity embeddings here, i.e. this work may still be of interest even if the size of the model is larger, but the current version of the paper has several claims about size savings that seem incorrect.
2. The analysis of the proposed method (where it helps, where it fails, which hyperparams matter) is still lacking. The authors did mention one ablation in the supplementary that I missed, but I don't think that is sufficient for a reader to understand how to build on this method in the future without rerunning all the experiments, doing an extensive hyperparam search, etc.

The paper introduces the use of additional knowledge to pretrain language models, adding entity information to the input and output, and the method improves the performance compared to the corresponding models without the knowledge. This is an interesting piece of work and worth publishing; however, I have some comments and questions about the article. Since the main point of the paper is adding entity information to the models, the paper would be better off including more explanations about the entity information. For example, with the dictionary lookup, what is ei exactly? Why did the authors use such knowledge rather than other knowledge like POS or so? Why do we need negative sampling? How many classes are there for entity? The models enjoy more information provided by entities; then what about more data to provide more information? How much can we improve the performance by adding entity information rather than additional data samples? The authors said that the models were trained with entity information from scratch; can we use pretrained models? Since we can simply add entity information to the input and output layers, it seems that the additional information can finetune the pretrained models. What is the role of the dummy QAs, and how did the authors come up with the 8 dummy QAs?

Summary: this paper presents a knowledge-aware language model pretraining method without changing the model architecture. Specifically, they add an entity prediction task along with the language modeling task to make the model aware of knowledge. Experiments show improved results on the LAMA knowledge probing task compared to GPT-2 models. They also show comparable results on the zero-shot question answering task, even with only 2 transformer parameters compared to GPT-2 17B.

Strengths:
1. The authors propose a simple (i.e. without changing the model architecture) method to give a knowledge-aware signal during pretraining.
2. The proposed method consistently outperforms GPT-2 on LAMA and zero-shot QA.
3. Interesting to see that entities from fuzzy frequency-based matching can make LMs significantly better.

Weaknesses / questions:
1. Needs extra parameters to save the entity embedding (471M); however, as the authors mentioned, this can be viewed as an external memory module, and embedding lookup doesn't need much computation.
2. I would like to see how much KALM can transfer to downstream tasks. The authors already show zero-shot QA results, but it would be better to see finetuning results on knowledge-intensive tasks. For instance, will KALM still outperform GPT-2 after being finetuned on KILT tasks (https://arxiv.org/abs/2009.02252)? I know KILT was released right before the due date of ICLR, but the tasks in KILT are all already published.
3. Even though KALM outperforms GPT-2, it is still unclear to me what the advantages of KALM are compared to Entities as Experts, or EaE (https://arxiv.org/abs/2004.07202). For example, EaE outperforms KALM on the LAMA probing tasks, and EaE also outperforms T5-11B when finetuned.
4. The authors mentioned that fuzzy frequency-based entity matching is better than precise entity linking (e.g. EaE) on page 3, but EaE outperforms KALM on the LAMA probing tasks; based on the experimental results, precise entity linking seems more effective to me. It would be helpful if the authors can provide a more thorough comparison with EaE.

Typo (page 3): "united states at wii2", "united staes at wii1".

Post author response: thank you for preparing detailed responses, which helped me to clarify several questions; however, one of my major questions is still unclear. Although 4 out of the 11 KILT tasks are already included in the main paper, most of them are LAMA knowledge probing tasks or zero-shot QA tasks. It is still unclear how much and how robustly KALM can transfer to other downstream tasks with finetuning (e.g. Wizard of Wikipedia, FEVER, QA with finetuning). For instance, the CorefBERT paper, which uses a similar idea (but, as you mentioned, uses bidirectional attention), shows its transferability on Quoref, six extractive QA benchmarks, DocRED, FEVER, five coreference resolution benchmarks, and GLUE, which can convince me to choose CorefBERT over BERT. The experiments in this paper are promising but not diverse enough to make me choose KALM over GPT-2; thus I would like to stick to my original score. ### Summary:
The authors propose to improve the LM's ability to model entities by signalling the existence of entities and also allowing the model to represent entities as units; the embeddings of the surface form and the entity unit are then added and passed through a layer to predict the next word. The paper evaluates on QA and conducts probing tasks, and shows that such entity modelling results in better performance. All reviewers have found the idea conceptually simple and novel. At the same time, a number of concerns are raised, with the most important being the lack of understanding around which and how hyperparameters matter for this model and, most importantly, the confounder introduced to the model by the much larger number of parameters introduced by the embedding layers. While the authors comment that not all the parameters are used all the time, the size of the embeddings still counts toward the total number of parameters a model has. Thus, without properly controlling for this (e.g. having another model where the extra embedding params are given to another part of the model), it is difficult to determine whether adding more parameters was the solution, or adding more parameters for modelling the entities.
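The mechanism discussed throughout these reviews, summing each token's surface-form embedding with the embedding of the entity it was matched to and training with an auxiliary entity-prediction margin objective, can be sketched as follows. This is a minimal illustration assuming a PyTorch-style implementation; the class names, sizes, the use of id 0 for "no entity matched", and the exact hinge formulation are assumptions made for the sketch, not details taken from the paper under review.

import torch
import torch.nn as nn

class EntityAwareInput(nn.Module):
    def __init__(self, vocab_size=50000, num_entities=20000, dim=256):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim)
        # entity id 0 is reserved for "no entity matched by the dictionary lookup"
        self.entity_emb = nn.Embedding(num_entities, dim, padding_idx=0)

    def forward(self, token_ids, entity_ids):
        # token_ids, entity_ids: (batch, seq_len); entity_ids is 0 where no match
        return self.word_emb(token_ids) + self.entity_emb(entity_ids)

def entity_margin_loss(hidden, entity_emb, gold_ids, negative_ids, margin=1.0):
    # One plausible margin objective: score the gold entity above a sampled
    # negative entity by at least `margin`, instead of a cross-entropy over all entities.
    gold_score = (hidden * entity_emb(gold_ids)).sum(dim=-1)
    neg_score = (hidden * entity_emb(negative_ids)).sum(dim=-1)
    return torch.clamp(margin - gold_score + neg_score, min=0.0).mean()

model_input = EntityAwareInput()
tokens = torch.randint(0, 50000, (2, 8))
entities = torch.randint(0, 20000, (2, 8))
embedded = model_input(tokens, entities)   # (2, 8, 256), fed to the transformer
negatives = torch.randint(0, 20000, (2, 8))
loss = entity_margin_loss(embedded, model_input.entity_emb, entities, negatives)
print(embedded.shape, loss.item())

In a full model the loss would be computed from the transformer's hidden states rather than from the raw input embeddings; the call above is only there to demonstrate shapes.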
input_ids: [token-id sequence encoding the review and summary above]
attention_mask: [all ones, same length as input_ids]
labels: [identical to the input_ids sequence above]
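The three fields above (input_ids, attention_mask, labels) follow the standard causal-language-modeling layout: the attention_mask is all ones (no padding), and the labels are an exact copy of the input_ids, so the model is trained to reproduce the whole prompt, review, and summary sequence. A rough sketch of how such fields are typically produced is given below, assuming a Hugging Face tokenizer; the gpt2 checkpoint is an assumption for illustration, since the tokenizer that actually generated these ids is not identified here.

from transformers import AutoTokenizer

# Checkpoint name is an assumption; swap in whichever tokenizer the dataset actually used.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def build_example(review_text, summary_text):
    prompt = ("Below is a review of a research paper from a conference journal. "
              "Please write a summary of the review. ### Review: ")
    text = prompt + review_text + " ### Summary: " + summary_text
    enc = tokenizer(text)
    enc["labels"] = list(enc["input_ids"])  # causal LM: labels simply copy input_ids
    return enc

example = build_example("the paper proposes ...", "the reviewers agree ...")
print(len(example["input_ids"]), set(example["attention_mask"]))  # mask is all ones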
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: i find the idea of explaining the space of plans to the goal to be interesting and i believe using ilp for that is novel too the paper is mostly well written though it could benefit from a short conclusion section at the end and a comment regarding the need to evaluate this kind of explanation with users pros interesting idea and method explanations seem human readableintuitive to me cons it is not clear to me what input users or any human actually need to give to the method together with the problem description in order for explanations to be generated the text suggests users define predicates an inductive bias maybe i missed this but what would this be in tower of hanoi case for me it is odd to name the method an explainable pathfinding method typically the term pathfinding is used for very spatial applications like computer games and robotics but the method here seems very general and detached from spatial meaning the method does not automatically ground the explanations using any spatial knowledge so i find it hard to imagine how one would use it in realworld pathfinding gamesrobots some methods focusing on explainable pathfinding specifically actually do scale to these settings eg brandao et al explaining path plan optimality fast explanation methods for navigation meshes using icaps2021 why not call it explainable planning instead or explaining the space of plans eg eifler et al planspace explanation via planproperty dependencies ijcai2020 no user study minor issues please at least informally define what entailing means a node that entails a state entailed by a precondition in the introduction the authors say we focus on finding an explanation for solving all possible instances of a problem but do not give a motivation for why this would be a goodusefulimportant thing to do this seems strange to me there is a path to the goal from any goal node in this sentence unsolved states that the macroaction transforms negative examples there is a problem with punctuation and maybe by is missing after entailed in this sentence the results show a very simple representation that says that the largest disk goes to peg3 from either peg1 or peg2 actually the explanation is not written in this way and authors had to interpret the explanation to write it in this way even if trivially this form could have an impact in terms of how simple or intuitive the explanation is for users requires user study please discuss the issues behind and consequences for explainable systems given the lack of if statements learned by the ilp system we using there a longerdocsepthis paper takes an inductive logic programming approach to learning socalled explainable pathfinding graphs egraphs based on observations of solutions generated by planners whose solutions may otherwise be inscrutable to humans the paper proposes a number of ways in which egraphs may be parsed to generate explanations for human users of such systems the ilp method used can incorporate background knowledge and constrained such that only simple enough egraphs are generated the paper uses towers of hanoi as a running example and even discusses application potential to the rubiks cube domain typos related work section to introduction of a number of approaches to the introduction of a number of approaches though programs have a discrete structure one can optimize this a relaxed continuous structure instead remove this or add using before a in 
graphplan the graph is used to search for a plans change plans to plan in our work we construct the egraph that can solve all instances of a problem change the egraph to an egraph background the game of towers of hanoi is one player game the game of towers of hanoi is a one player game transitioning from some state s to another states s change another states to another state learning the graph are entailed the corresponding nodes precondition are used as positive examples are entailed by the corresponding nodes precondition are used as positive examples results we then create a hierarchical by learning programs we then create a hierarchical graph by learning programs discussion in our preliminary work we using an example based in our preliminary work we use an example based for example there a longer macroaction composed of only one atomic action for example a longer macroaction composed of only one atomic action suggestions and questions to the authors in the related work section it is mentioned that here the automated explanation method attempts to explain in a situation state why the planner chose an action why the planner did not choose an action why the planner decisions are better why things asked eg goals can not be done why one needs to replan and why one does not have to replan upon first reading this section i was anticipating that the approach described in the paper would be able to handle some of these explanation types however it is my understanding that the explanations derived from egraphs can only make interpretable the policy for all instances of some task i wonder if the authors could discuss how egraphs can be used to generate for example contrastive explanations and other types of explanations mentioned in the related work section related to the previous point it would be interesting to consider the relation of explanations derived from egraphs to work on model reconciliation 1 which aims to resolve misconceptions held by the recipient of an explanation typically a human user of a planning system for example in the rubiks cube domain a human observer may not be aware of some of the dynamics of the cube and may therefore be confused by say some of the ifthen rules generated by the approach described in the paper i would have liked to see a more detailed comparison to sreedharan et als 2022 work mentioned in the related work section the paper mentions that this work focuses on solving only one instance and that the current paper in contrast can solve all instances of a problem could the authors discuss how their work compares to sreedharan et als when given a single instance i appreciated that the paper discussed the simplicity of explanations to a human and the quantification of this notion using the cost of egraphs while the following consideration is likely beyond the scope of the current paper i was nevertheless wondering if the authors have given thought to the question of the perceived simplicity and interpretability of the various forms of explanations derived from egraphs for instance are such explanations meant mostly for experts with intimate knowledge of the domain dynamics or are they also intended for novice users since the term explanation is used somewhat differently in different places in the paper eg by representing an explanation as a directed graph in the intro and then the overall explanation is then any method that summarizes the entire graph later in the paper it would be helpful to have a at least somewhat formal definition of an explanation the use 
of pathfinding in the paper makes intuitive sense; however, it is a charged term in the planning community due to the rich literature on path planning. i was therefore wondering why it was chosen and whether "explainable planning" would also accurately describe the problem addressed in the paper.

as a reader not overly familiar with inductive logic programming, i found reading the technical section of the paper challenging without going to the literature. i think it would be helpful to readers if the paper included concrete pointers to ilp exposition and, possibly, if there is space, an ilp subsection in the background section.

[1] chakraborti t, sreedharan s, zhang y and kambhampati s, 2017. plan explanations as model reconciliation: moving beyond explanation as soliloquy. in ijcai, 156-163.
### Summary:
both reviewers agree that the paper presents a novel solution (ie the use of ilp-based techniques) to a very relevant problem (ie explaining a space of plans). the reviewer does point to some clarity issues, in particular the inconsistent or incongruent use of some terms, notably the term pathfinding; i hope the authors will get a chance to address these problems in a future draft. also, going forward, the authors should consider running user studies to verify their method, a recommendation also made by reviewer pv2u. all in all, i would recommend the acceptance of the paper.
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
the submission considers the problem of reconstructing private data from gradients in a federated learning system, which has recently been shown to be a threat in distributed learning systems. two types of federated learning systems are considered: vertical federated learning (vfl) refers to the case where different agents hold different features of the same data points, while horizontal federated learning (hfl) refers to the case where different agents hold all the features of different subsets of the data. previous attacks solve an optimization problem that aims to infer the data by minimizing the mismatch between real gradients and fake gradients; this method suffers difficulty when the number of samples in one round is large. the paper proposes cafe, which takes advantage of the fact that in vertical federated learning (vfl) systems the server can identify the indices of the samples that are selected in each round. this extra information helps reduce unwanted solutions in the optimization problem and helps improve the reconstruction performance. the authors conduct experiments to show that the proposed algorithm outperforms previous works.

the paper shows that data leakage from gradients is a potential threat in vfl systems even when the batch size is large. however, i have the following concerns about the paper.

1 the paper claims that the attack also works for the hfl setting; however, this is not well justified, for the following reasons: 1) the assumption that the server knows the indices of the samples that are selected in each round is not valid in general for the hfl setting, since each agent can sample a batch locally; 2) in hfl settings it is generally assumed that the number of agents is large and each agent only participates in a few rounds, which is not considered in the experiments in the submission.

2 in the experiments it is shown that as the number of training epochs grows, better inference on the private data can be made. it would be better if the authors could also include the training error at each epoch in the same plot. it is believable that if the training goes on forever, enough information can be inferred about the training samples; however, it might be good to see whether the server can infer the training samples before the model has already converged.

i hope the authors can address my above concerns in the response.

docsep

this paper studies the data leakage issue in federated learning. more precisely, when the servers have access to model parameters and gradients, they can recover the input data via gradient matching, and the authors claim that their method performs well even with large training batch sizes (eg over 40). finally, the authors also study the possibility of attacking during learning, where they suggest that multiple updates of fake data help. however, their contribution seems incremental: gradient matching is used in previous literature (zhu et al 2019), and their main modification is two extra regularization terms (total variation and internal representation regularization) and a data index alignment technique whose exact meaning is unclear in the paper.

the following are some questions.

what does index alignment mean? is it that the server controls the indexes of samples chosen at each iteration? this seems to be very restrictive in practice, especially for horizontal federated learning.

does the server have access to the aggregated grads from each worker separately, or do the workers aggregate all the
gradients before sending them back to the server? the second scenario can be achieved with a secure aggregation technique.

in vertical federated learning, the gradients of part 1 of the network do not need to be exchanged with the server, as there is no average operation needed; even the parameters themselves do not need to be transferred to the server, for the same reason. will your method work under this setting?

some terms are not properly defined, such as normalized gradient descent, batch ratio, et al.

other questions:

what does "iterations" represent in table 1a? is that the number of iterations needed to reach a 35 psnr?

using cosine dissimilarity decreases the psnr; i assume this is because psnr penalizes the scale. is there noticeable degradation visually when using cosine dissimilarity?

in the attack-during-learning scenario, is there any intuition why optimizing fake data multiple times works better?

docsep

the paper proposes an attack to extract information about training data from gradient updates sent as part of a federated learning setting. the description of the attack setting and the attack algorithm is provided at a high level, and a detailed description is missing, making it hard to understand the novelty of the contribution. experimental results do show that the attack is stronger than previous work; however, the overall presentation of the paper could be improved to be ready for a publication. some suggestions are listed below.

attack setting: it seems that the paper departs from some of the related work on information leakage by considering an attacker that can tamper with the federated learning process; hence the attacker is malicious and not benign. the authors should make this distinction clear if it is indeed the case. commenting on why this malicious activity will not be noticed by the workers is important; for example, figure 5 indeed shows that training accuracy is impacted by the attack. how does this attack compare to the active attacker in the work by melis et al? the paper considers only 4 workers in most experiments; federated learning usually has many more participants. is this a problem? do the same workers need to be contacted at every iteration? are there any assumptions on the same data being used in each iteration?

attack algorithm: the algorithm makes use of data index alignment; some guidance on what it means would facilitate the reading and understanding of the algorithm. the red part of algorithm 1 should be expanded: how do these values get computed? do they replace the blue parts or complement them?

presentation: as a motivating example, figure 1 compares performance against previous work; it would be best to motivate the algorithm's key insight and not its improved performance over previous work, which was already mentioned. "however as shown in figure 2 a curious server can provide the same legitimate computation as a benign server while simultaneously perform data recovery in a stealthy manner the server symmetrically generates fake": the figure does not describe how fake parameters are computed, and it is not clear how the pictorial representation shows this.

minor details: figure 1 captions: why 40 vs 10 x 4? "workers participating fl" -> "workers participating in fl". please consider a better title for 4.4 ("attacking while fl").

docsep

this work introduces cafe, a novel training algorithm to leak training data in a federated learning setup, extending from deep leakage from gradients (dlg). fake images are optimised with respect to the difference between the gradients observed from the client (ie computed with the real images) and the gradients computed with the current version of the fake images.
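for concreteness, the gradient-matching loop that these reviews keep referring to (dlg-style) can be sketched in a few lines; the toy model, shapes, known labels and plain l2 matching below are illustrative assumptions, not any of the papers' actual implementations.

```python
import torch

# minimal, illustrative gradient-matching (dlg-style) reconstruction loop
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(32 * 32 * 3, 10))
criterion = torch.nn.CrossEntropyLoss()

# the "real" gradient the attacker observes from a client update
real_x = torch.randn(4, 3, 32, 32)             # stand-in for the client's private batch
real_y = torch.randint(0, 10, (4,))
real_grads = torch.autograd.grad(
    criterion(model(real_x), real_y), tuple(model.parameters()))

# trainable dummy batch; labels are assumed known/recovered to keep the sketch short
fake_x = torch.randn(4, 3, 32, 32, requires_grad=True)
optimizer = torch.optim.Adam([fake_x], lr=0.1)

for step in range(300):
    optimizer.zero_grad()
    fake_grads = torch.autograd.grad(
        criterion(model(fake_x), real_y),
        tuple(model.parameters()), create_graph=True)
    # minimize the mismatch between fake and observed gradients (plain l2 here;
    # cosine dissimilarity, total variation and feature regularizers are common variants)
    loss = sum(((fg - rg) ** 2).sum() for fg, rg in zip(fake_grads, real_grads))
    loss.backward()
    optimizer.step()
```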
however, dlg does not work when the minibatch size increases, due to a messy gradient representation. in this work the authors propose to keep track of the batch index: indeed, it may happen that the server decides the batch indices corresponding to the training data that will be used by the client during the local training. within such conditions, a malicious server can easily store fake images corresponding to specific indices and therefore optimise each fake image correctly wrt the corresponding real image.

it is clear from the obtained results that this method works and that images are recovered. however, i am unsure about the relevance of the experimental protocol: 1) if the server does not ask for specific indices (and it is pretty common), the method is equivalent to dlg, ie it does not work well with large batches; 2) what if we don't have the gradients? a common way of doing fl is to simply communicate the locally trained weights with multiple local epochs, as specified in the introduction (point 3); the proposed method wouldn't work in this realistic scenario.

then, i found section 3.3 unclear in some aspects. are the two proposed regularisation methods relying on the real image? if so, isn't this a strong bias? we are not expected to have the input images. i suppose that this comes from the citation to the work of geiping et al 2020; however, these two paragraphs should be rewritten to clearly explain how we can extract the input vector and how it relates to eq 8. "to promote the smoothness of the fake images we assume the tv norm of the real images as a constant" -> we can't use the real image here, so it is not valid.

pros:
in the given conditions, cafe clearly outperforms the other approach to leak training data from gradients during fl.
very simple attack to implement.

cons:
the conditions necessary to the success of the proposed methods seem to be quite strong and not really connected to a realistic fl framework.
small ideas can lead to drastic changes in the field, but the core idea of the paper is to solely store batch indices.

remarks:
"in this section we provide necessary background of fl in this section"
figure 2 should be checked: "aggreaged upload fake gradient only once"
what are t and b in eq 5?
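for concreteness again, the batch-index bookkeeping ("data index alignment") described in the review above roughly amounts to keeping one persistent, trainable fake entry per sample index and refining only the entries of the batch the server itself announced; the sketch below is an assumed simplification in the same spirit as the previous one, not the paper's code.

```python
import torch

model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(32 * 32 * 3, 10))
criterion = torch.nn.CrossEntropyLoss()

# persistent fake dataset: one trainable entry per (server-chosen) sample index
num_samples = 800
fake_data = torch.randn(num_samples, 3, 32, 32, requires_grad=True)
optimizer = torch.optim.Adam([fake_data], lr=0.1)

def attack_round(batch_idx, observed_grads, labels):
    """One reconstruction step; batch_idx are the sample indices the server announced."""
    optimizer.zero_grad()
    fake_batch = fake_data[batch_idx]             # exactly the entries the client trained on
    fake_grads = torch.autograd.grad(
        criterion(model(fake_batch), labels),
        tuple(model.parameters()), create_graph=True)
    loss = sum(((fg - og) ** 2).sum() for fg, og in zip(fake_grads, observed_grads))
    loss.backward()                               # gradient flows only into the selected entries
    optimizer.step()
    return loss.item()
```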
### Summary:
this paper focuses on attacks on federated learning. the reviewers had the following concerns:
- the assumption of knowledge of batch indices is unrealistic in an hfl setting.
- the setup only works when doing a single epoch; i believe the authors claim that it is applicable in more general settings, but evidence to that effect has not been provided.
- the novelty of the approach is somewhat limited.
- the description of the algorithm and the comparison to prior work could be clearer.

i raised the question of whether the reviewers would be more positive if there were no claimed results on hfl; they still did not seem positive enough to justify acceptance, due to the other reasons mentioned above.
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this manuscript describes a connection between potts models and attention as implemented in modern transformers the authors then present an attention model in which positional encodings are defined as onehot vectors indicating fixed positions in the multiple sequence alignment and train single layer attention models these models unsurprisingly perform similarly to potts models without apc correction for contact prediction the methods section is somewhat confusingly written i think the factored attention model would benefit from being described on its own terms rather than in connection with typical multiheaded attention especially because the isolation of position encodings and amino acids at those positions dramatically simplifies the understanding of wq wk and wv the authors also spend a long time describing well known methods but without providing additional insight the connection between the potts model and attention described in this paper should be obvious to those who already understand attention models and potts models and the empirical results of the factored attention model dont make this approach seem compelling in the discussion the authors make several broad future speculations some of these would be interesting contributions and i encourage the authors to develop this work further maybe factored attention could be promising for better capturing dependencies between positions for deeper transformers on msas but it isnt likely that this work will be of broad interest to the machine learning community this manuscript seems better suited to a workshop or other specialized venue some specific comments on this work follow below 1 in the factored attention model the authors use onehot encoding of the position index as the position encoding this is equivalent to learned position embeddings as in bert which is worth mentioning 2 the authors discuss singlesite potentials as a difference between potts models and single layer attention models and then show a comparison of attention models with and without singlesite potentials showing little difference however attention models already implicitly have singlesite potentials which arise from the positional encoding input features granted this is not the case for the factored attention model where singlesite potentials seem to have more effect though in the negative direction 3 the authors state that the ability of factored attention to capture similar contacts to potts without use of apc suggest that it may be more suitable for protein design i dont follow this conclusion if the factored attention model performs equivalently to the potts model alone and worse than the potts model with apc correction why would it be more suitable for protein design 4 what makes the singlelayer attention or factored attention models compelling for protein modeling what problems do these models solve that are not better solved by the potts model or traditional transformers what would raise my score 1 present a compelling use case for the factored attention model what questions can be answered or better answered with this model over the potts model or other alternatives one idea is to use the factored attention model as the layers in a full deep transformer model and see if this architecture can improve tasks where msa training data is available edit i have increased my score in light of the response and manuscript edits the manuscript is improved 
but i think the method still needs more development there are a number of interesting pieces but the final picture of an improved protein model is not fully resolveddocsepsummary this paper explores the connection between the classic potts modelbased approaches and modern transformerbased approaches for protein contact map prediction to this end the authors introduce a simplified variation of the attention layer called factored attention and show that a single layer of factored attention performs operations similar to those performed by the potts modelbased methods pros the paper attempts to connect classic and modern approaches to protein contact map prediction which might be interesting to the people working in this field the evidence presented simplifying attention layer so that the equations look similar to the classic methods numerical results of the simplified attention layer close to the classic methods is reasonably convincing the topic of the paper is quite timely there has been a lot of interest recently in modelling proteins using the latest nlp techniques the paper is well written i appreciate the effort put in by the authors to define basic protein terminologies which might not be obvious to readers without biology background cons the contributions of the paper would have been more interesting if the proposed modifications of the attention layer led to increased prediction performance of models which are representative of the stateoftheart specifically if retraining protbertbfd using the modified attention layer led to further improvement in performance that would have been a solid contribution are mrf models really that competitive for contact map prediction from what i understand deep neural networks have been far better at this task for quite some time now at multiple places in the paper the authors give the impression that mrf models are close to stateoftheart in the last paragraph of the introductory section the idea of encoding the msas is introduced which seemed interesting however from what i understood from the rest of the paper the queries and keys are extracted solely based on the position of the amino acid is that right if so does the position correspond to the position in the sequence or in the msa are the actual alignments used in any of the results in the paper please clarify comments section 31 each edge should have a capital e section 33 specifically the part where you show that factored attention is a pairwise mrf is too brief given that this is a main contribution of the paper it would be worthwhile to explain this connection in a more detailed mannerdocseprecently some researchers tried to apply attention models into the protein field using selfsupervised learning to predict protein contacts in this work the author attempt to build the connection between such works and the oldschool model potts model by simplifying some operations within the attention model the author managed to build an analog between the simplified model and the potts model the analog is intuitive and easytounderstand the authors further compare the simplified model and the potts model on 748 protein families showing that they are similar or probably the simplified attention model is even better this is an interesting work however i also have a number of concerns the advantages and disadvantages are listed below pros 1 the manuscript is concise and easytounderstand 2 the idea is intuitive and reasonable with experimental support cons 1 the analog between the simplified attention model and 
the potts model is intuitive but not rigorous the authors claim that they provide a theoretical connection between the two models however that part is not strong enough without proof 2 there are two assumptions in this work which make the simplified model different from the attention models that the previous researchers used firstly they train the model on multiple sequence alignment instead of the raw sequences if they train the model on the raw sequences the performance is unacceptable as shown in figure 16 which is consistent with the previous research secondly they removed the sequence embedding in queries and keys this simplification makes the model only consider the statistical pattern in the msa to me this one is a too strong assumption 3 the running time and hardware comparison is missing if the single layer of attention is comparable to the potts model not outperform it significantly while it would take much more time to train the researchers would need to think twice if they want to use the attention model 4 the ablation study makes me feel that the results are on the opposite of the conclusion here is my logic with the above two assumptions the attention model can achieve similar performance as the potts model or a little bit better however when we train on the unaligned sequences which is the usual case that we would use the attention model the performance becomes unacceptable then why we want to use the more expensive attention model the attention model in the nlp field is a different story those models are refreshing the stoa performance all the time however in the protein field the attention model can still only achieve comparable performance as the classific models after a twoyear study they seldom outperform classic algorithms the results in this manuscript are consistent with the previous research so i am not convinced regarding the conclusion in the abstract taken together these results provide motivation for training transformers on large protein datasets 5 the potential audience of this paper would be those who are specialized or interested in bioinformatics and proteindocsepsummary transformers models have been recently shown to capture protein contact information in their attention maps when trained unsupervised of millions of protein sequences this paper draws parallels between transformers and potts models fullyconnected pairwise mrfthe current standard approach for protein contact predictionand shows empirically that transformers are competitive with potts models understanding the differences and similarities between transformers and potts models makes transformers less of a blackbox and helps to establish them as a principled method for contact prediction the paper is clearly written and the evaluation is solid i have only a few comments major comments 1 what is the maximum sequence similarity between the training sequence of protbert and sequences in trrosetts alignments that were used for testing sequences must not overlap have a maximum similarity of lets say 80 2 you describe that you used three sets of families from the trrosetta dataset a41 why did you use only 732 families for testing set 3 were these all families that were not included in the first two sets how many families do the first two sets include and how similar are families of different sets ideally train tune and test families belong to different super families 3 you describe in section a3 how you extracted protein contact maps from the attention maps of protbert this is an important detail that 
must be described in the main text how did you choose the 6 heads did you choose them manually or for example by training a linear model to predict contacts from attention maps and using the weights for identifying important heads or computing the weighted average of attention maps minor comments 4 section 32 x eseqxi eposi how did you compute positional embeddings and why do and add embeddings instead of concatenating them 5 section 32 we treat the positional embedding epos as an overall summary perposition information please describe more clearly what this summary is 6 section 4 first paragraph the l of the precision at l metric is not the sequence length but the number of top sequences you describe l as being both 7 figure 6 is not discussed instead of showing this figure i suggest quantifying the correlation depending on the number of heads by computing and discussing the spearman correlation 8 rives et al 2020 biological structure and function emerge have recently shown in addition to vig et al that protein contact can be predicted from attention maps which must be also pointed out in the background section ### Summary:
the paper shows a connection between potts model and transformers and uses the connection to propose a factored attention energy to use in an mrf results are shown using this energy based on factored attention also pretrained bert models are used to predict contact maps as a comparison the reviewers found the paper interesting from a protein structures prediction point of view but from a machine learning perspective their opinion was that the paper does not offer a coherent compelling method that is very novel and the connection between potts and an energy based attention model is not that overwhelming in addition the presentation was somewhat circuitous the authors made improvements to the paper over the course of the review which is appreciated but the method presented does not match the target for an iclr paper in terms of methodological contributions
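The reviews above lean on two pieces of contact-prediction machinery without spelling them out: APC (average product correction) of a raw coupling or attention score matrix, and the precision-at-L metric, where L is the number of top-ranked pairs (conventionally taken equal to the sequence length) rather than the length itself. A minimal numpy sketch of the conventional definitions follows; the sequence-separation threshold of 6 and the toy data are illustrative assumptions, not details taken from the reviewed paper.

```python
import numpy as np

def apc_correct(scores):
    # Average Product Correction: subtract (row mean * column mean) / overall mean
    # from each entry of a symmetric L x L matrix of raw pairwise scores.
    row_mean = scores.mean(axis=1, keepdims=True)    # shape (L, 1)
    col_mean = scores.mean(axis=0, keepdims=True)    # shape (1, L)
    return scores - (row_mean * col_mean) / scores.mean()

def precision_at_k(pred, contacts, top_k, min_separation=6):
    # Fraction of true contacts among the top_k highest-scoring residue pairs,
    # restricted to pairs at least `min_separation` apart in sequence.
    length = pred.shape[0]
    i_idx, j_idx = np.triu_indices(length, k=min_separation)
    top = np.argsort(pred[i_idx, j_idx])[::-1][:top_k]
    return contacts[i_idx[top], j_idx[top]].mean()

# Toy usage with random scores and a random ground-truth contact map.
rng = np.random.default_rng(0)
length = 50
raw = rng.random((length, length))
raw = (raw + raw.T) / 2                               # symmetric raw scores
contacts = rng.random((length, length)) < 0.05
contacts = np.triu(contacts, 1)
contacts = contacts | contacts.T                      # symmetric boolean map
print(precision_at_k(apc_correct(raw), contacts, top_k=length))  # precision at L
```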
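One question raised in the reviews above — how a contact map is actually read out of a pretrained transformer's attention maps, and how the 6 heads were chosen — is reported as undocumented. Purely as an assumption about what such a pipeline commonly looks like (not the procedure of the reviewed paper), a generic recipe is to average a chosen subset of heads, optionally with learned weights, and symmetrize the result before applying APC and top-L selection:

```python
import numpy as np

def contacts_from_attention(attn_heads, head_weights=None):
    # attn_heads: array of shape (H, L, L) holding one attention map per head.
    # Returns a symmetric (L, L) score matrix from a (weighted) head average.
    num_heads = attn_heads.shape[0]
    if head_weights is None:
        head_weights = np.full(num_heads, 1.0 / num_heads)    # plain average
    merged = np.tensordot(head_weights, attn_heads, axes=1)   # shape (L, L)
    return (merged + merged.T) / 2                            # symmetrize

# Toy usage with random maps standing in for a real model's attentions.
rng = np.random.default_rng(1)
num_heads, length = 6, 40
attn = rng.random((num_heads, length, length))
attn = attn / attn.sum(axis=-1, keepdims=True)                # row-normalized
scores = contacts_from_attention(attn)
print(scores.shape)                                           # (40, 40)
```

The open point in the review is precisely which heads enter this average and whether the weights are hand-picked or fitted, so the uniform weighting here is only a placeholder.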
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper devises an approach to semantically encode ordinal categorical features as numerical features by using pretrained language models when preprocessing tabular data named bertsort in addition this paper introduces a new benchmark that is collected from 10 different public data sets with 42 different ordinal features upon this benchmark bertsort significantly outperforms traditional encoding methods such as ordinalencoder na for reproducibility reviewers na for reproducibility reviewers na for reproducibility reviewers na for reproducibility reviewers na for reproducibility reviewers docsepthis paper considers preprocessing of categorical features by treating them as text to which a pretrained transformer like bert is applied to identify what would be the most likely ordering of these words in a natural corpus then the categories are ordinally encoded according to this ordering the authors show their bertsort is reliably able to improve the performance of 45 automl systems over 10 datasets compared to the default data preprocessing scheme used in those systems thus there could be moderate impact on the field of automl if these results generalize across more automl systems and datasets that said the authors did not compare with other preprocessing strategies which perhaps might work as well as bertsort for these particular datasets the way each automl tool was run seems correct to me assuming you ran each tool on a separate machine with no other processes running on the machine while the automl tool was running the results in table 5 seem biased if i understand correctly you are giving an automl tool a f1 score 0 when it fails on a dataset that seems like an overly strong penalty to me and muddies the comparison across different tools since their scores are over different datasets also any intuition why bertsort helps increase the number of successes for mljar and tpot a minor quibble is you are not accounting for the additional time taken by bertsort when comparing against the raw run of each automl system the paper is overall clear enough to understand the methodology and findings this paper introduces a very simple idea with promising looking results however there are not enough comparisonsexplanations of the details to convince me their results will be as useful in realworld automl as presented in the paper first off there should be more baselines eg one option is just to use pretrained or finetuned bert to embed the categories text into a higherdimensional vector which can be fed into automl systems this is considered in the following paper which should be discussed benchmarking multimodal automl for tabular data with text fields 2021 httpsarxivorgabs211102705 another option is to represent categories in a labeldependent way via target mean encoding a third option is to use a feedforward network for the dataset with embedding layers for each categorical variable that learn embeddings of its categories from scratch these embeddings could then easily be fed into the automl system other questions for reproducibility what versions of each automl tool are you using what machine are you running them on are you running each tool as the only process on the machine how much extra time does bertsort add to the overall training of each automl system why did you choose f1 score as the predictions evaluation metric i would be curious if the bertsort gains remain across other 
evaluation metrics like auc accuracy why are there so many automl failures on your chosen datasets are you running on a machine with enough ram or are these datasets peculiar in some fashion how do you handle missing values in your approach how do you handle previously unseen category in the test data during inference the presented idea is very simple the implement and can be combined with any automl system although the authors should comment more on its additional computational complexity my biggest concern is around the trustworthiness and generalizability of the results in particular the boost to automl systems accuracy which seems like the major contribution of this paper if the authors can assuage my concerns i am open to increasing more score as the results do look promising and i like the simplicity docsepthe authors introduce bertsort a novel method to semantically encode categorical values with zeroshot masked language models for tabular data they compare their approach to existing techniques used in modern automl systems both in terms of accuracy of ordinal encodings and endtoend automl performance via 10 datasets and show improvement in both areas being able to correctly preprocess features in a fashion which maximizes predictive power is of high importance to automl systems and ordinal encoding has been historically difficult to automate without hints from the user because this work mainly applies to a particular type of feature within tabular data this work is of medium impact while i have only minor issues with the approach theory and experiments in table 4 comparing against the ordinalencoder for ordinal accuracy accuracy i do have severe issues with table 5 and the conclusions found within 1 l140 how is the common word calculated what if a value contains multiple common words the implementation is not clear 2 l156 i am having trouble understanding how this could be ollogn when n elements need to be passed to bert sortcases to me this appears to be at minimum on2 additionally ollogn implies that we could simply have l n and get onlogn instead of on which doesnt seem right 3 l226 all automl tools failed on two data sets regressions this is never explained as to why 4 l226 and success on 8 classification problems this is only true for h2o 5 table 5 this table and the conclusions surrounding it are extremely misleading because you count failures as having a score of 0 the 1061 performance gain is purely due to the fact that tpot and mljar succeeded on more datasets with the bertsort input removing tpot and mljar due to this issue the conclusion would be that performance was made worse by bertsort due to the major drop in performance of h2o correcting this issue completely flips the conclusion of the paper yet the authors at no point mentioned this no explanation is ever made about what datasets each framework failed on and why nor any investigation into the cause of h2o performing worse with bertsort because this method doesnt change the format of the data it is hard to imagine why bertsort would improve the success rate of automl frameworks which makes this extra confusing another issue is that for example it is impossible to tell if the 3 datasets succeeded by flaml in raw and bertsort are even the same datasets making the comparison of scores between them meaningless there is no overall table that shows scores for each framework for each setting on each dataset the choice to use f1score is odd and was never explained as it skews the importance of positive and negative 
classes rocauc and acc both seem like more reasonable options although this is a minor issue compared to those i mentioned previously overall the paper is clear and well written my major issues come with the comments i mentioned in technical quality correctness most importantly in table 5 the authors propose a novel and interesting idea to leverage pretrained language models for more accurate ordinal encoding in tabular datasets this has strong potential to improve endtoend automl systems by improving the quality of the data fed to downstream models in particular this problem has no easy solution outside of naive approaches as described by the authors currently implemented in automl systems so leveraging pretrained models to order the values is a step in the right direction and shows encouraging results via table 4 however i cannot ignore the major issues in table 5 as i discuss in detail in earlier sections unfortunately this drastically hinders the paper due to a combination of misleading numbers lack of information on reasons for dataset failures a lack of overall perdataset scores a lack of reproducible code to run the experiments and an incorrect conclusion that fails to discuss any drawbacks to the method or inconclusive findings from the experiments because of these issues i cannot accept the paper in its current form this is a good paper hindered by a flawed experimental result analysis and conclusion which unfortunately cannot be overlooked in its severity because of these issues i vote to reject refer to the other sections for detailed comments update from rebuttal i have increased my score from 2 to 3 due to the improvements made in the rebuttal docsepthe authors primary contribution is the development of a method of creating more meaningful feature embeddings for categorical variables that may carry hidden ordinal information for instance the words mild moderate severe make the most sense in terms of an ordinal relationship and the authors posit that creating an embedding for this variable that respects this order could make ml models more successful at detecting relationships between these and other variables to enable these embeddings the authors leverage mlm masked language modeling models which are commonly used in nlp to predict a masked token in the context of other unmasked tokens these predictions are then used to determine which ordering of the variables is most appropriate in effect leveraging the mlms knowledge about what order these words appear in the corpora on which they are trained the authors test many different choices for mlms and emphasize that practitioners can choose from among many pretrained models in the application of their technique furthermore the authors suggest that the method they use to determine the most likely ordering can be leveraged as a kind of metric to indicate whether a sequence of variables exhibits an ordinal relationship this assertion along with the primary one is tested using different automl technologies which the authors claim demonstrate the superiority of their solution over the default categorical embeddings found in these libraries it is possible that more meaningful unsupervised embeddings of variables would be of use to the automl community as well as the ml community at large as is discussed below some technical issues and massive issues with clarity make it very difficult to know whether the impact cited in the abstract 10 performance gain is to be believed there are some glaring issues with correctness in this paper i 
have broken them down into different topics as i experienced them complexity computations on lines 156157 the authors say that their proposed computational method gets a speedup from mathcalon to mathcalozetalogn plus a constant that doesnt affect the bigoh computation this is highly dubious consider the case when one chooses zeta1 then the algorithm has to evaluate 1 2 3 cdots nfracnn12 different possible orderings resulting in a runtime of mathcalon2 if we assume that zeta is fixed my backoftheenvelope computations says the time complexity should be closer to mathcalonzeta n2 as long as zeta is chosen to be less than fracn2 this still represents a speedup over the naive approach but it is nowhere near logarithmic in n math notation there are a few math notational errors including umckexists mathcal weta where the authors intended to write umckin mathcal weta the u notation which may be common notation in nlp circles was unfamiliar to me and i had to try to decode its meaning from context some clarification would help there the notation hat pmcksceta was a bit misleading to me since it appeared to be computing the probability of a particular mask instead of the unmasked token which was masked in the application of the kth mask to the cth sequence i dont think this was necessarily wrong as much as it was confusing some work on notation would make this whole section more legible algorithm 1 there are a couple of small errorstypos that need to be addressed the first occurs on line 16 here one is determining the new sequences to consider but it is computed as the union of mathcal c the collection of all zeta length combinations in mathcal a and seqmathcal c mathcal e which i presume is supposed to denote the sequences created by inserting mathcal e at different places as can be seen in figure 2 i do not believe that you want to consider the sequences in mathcal c these have already been computed in the first steps and a better notation for what you want should be seqmathcal imathcal e the set of sequences created from inserting the token mathcal e into mathcal i at different places later on line 28 the authors use a union between what i understand to be a set of probability values and a single probability value set brackets around the latter value would aid in clarity although the intention can be deciphered since the values are summed at the end anyways i would suggest the authors use an accumulator variable simply adding the values after computing each ordinal value detection the results shown in the confusion matrix in table 1 are quite bad i computed their statistics again and got a precision of 0875 as stated in the paper a recall of 0204 much different than the stated value of 1 in the paper a specificity of zero and an f1 score of 0331 which was claimed to be 0933 these show either a severe miscalculation on the part of the authors or errors in transcribing their results my takeaway is that the results as recorded demonstrate that the ordinal value detection using bertsort fails horribly on its assigned task with a large majority of sequences being misclassified section 42 this section has a significant error the formula for ordacc in equation 2 does not compute the value that appears in table 3 the formula given is effectively a normalized version of manhattan distance between two vectors which also somewhat calls into question whether this is truly a novel metric but the values shown in the table bear no resemblance to the values that come from using it worse the definition given is 
a metric meaning that identical vectors will get a value of zero instead of 1 which one would expect with a measurement of accuracy i suspect the problem is that the definition in equation 2 is just wrong but i cannot determine what the correct formula should be from the rest of the paper

section 4.3

there is a serious lack of clarity here discussed in the next section but from my understanding the methodology is also questionable i am assuming here that the intention of the authors as it seemed from reading

on the easier side the paper is in serious need of some proofreading some sections are better than others but there is a general problem with subjectverb agreement and awkward phrasing that i found quite distracting some examples include one of the algorithms to process the context is maskedlanguage modeling mlm where it computes the probability lines 98-99 and second bertsort instead of generating a large number of permutation cases when a it is sorting only number of elements then it sorts the rest of elements onebyone sequentially lines 147-148 in both of these cases the meaning is clear but some work should be done to make the language more clear and natural

unfortunately the part of the paper that suffered the most from a lack of clarity is the description of the main result of the paper in the results in section 4.3 the results seem to be incongruous with the experiments as described the authors use several data sets which by their own description include both classification and regression tasks yet all that is reported is the number of successes the definition for which is never discussed nor can i imagine a definition i would accept as well as f1 scores which can be generalized to classification tasks but as far as i know make no sense for regression the 10% accuracy increase appears to be related to the increased f1 scores which again are confusing and dont make much sense in the context of the problem i understood the authors claim to be that their method increased the accuracy of an endtoend automl model but i dont see any results in the main paper nor the appendices that talk about the performance of a complete model instead i see results about establishing the correct ordinal relationship of labels and these mysterious success / failure / f1 statistics

the authors have found a problem that many ml practitioners will find interesting and their solution seems to be a good one to me i appreciate how the authors draw from the knowledge of nlp researchers to aid in other fields the proposed solution is just complex enough to be interesting while still being tractable to implement for any experienced ml user

the paper needs a lot of attention to the language and technical aspects however the authors are unclear about what their main results are which obscures from the reader whether they are meaningful mathematical and notational mistakes are common enough that they require the reader to make leaps to understand what the authors are saying in places finally some results eg the results about using mlms as a metric of ordinal relation among tokens seem to fail in spite of the text stating the opposite

i would suggest that the authors carefully rewrite their paper with more clear results and some editing and resubmit at a future venue due to the clarity issues and the many technical errors and typos throughout the paper i do not recommend the paper be published in its current state the idea seems promising and i believe the authors could have made a significant contribution to the field but in its
current state it fails to deliver on its promises or convince me that it was performed thoroughly and competently

docsepthe authors introduce the bertsort algorithm to semantically encode ordinal categorical values via a zeroshot pretrained masked language model the authors also construct benchmarks from 10 public sets for sorting ordinal categorical values where their proposed method seems to outperform existing automl approaches

1 being the first to provide an automated machine learning approach to process ordinal categorical values into a form that facilitates model training and feature learning 2 construct benchmark tasks to evaluate models capacity in dealing with ordinal categorical values preprocessing

the experimental details seem correct the paper is well written

the authors should provide an additional ablation study on the necessity of the two approaches detailed in section 3.2 the design detailed in section 3.2 seems to indicate that the bertsort algorithm is not even able to 1 sort hot, boiling hot, lava hot, cold into cold, hot, boiling hot, lava hot or 2 sort months according to their ordinal numbers which seem to be extremely simple when a tremendously large model such as bert is used i think the author should provide experiments regarding bertsort without using the two approaches mentioned above otherwise i would presume it is the inductive bias in the first and the second approaches that induces the results in table 4

in table 5 i do not see a significant increase in the number of successfully trained models when bertsort is used to be more specific the number of successfully trained models is 23/50 w/ the bertsort algorithm and 21/50 w/o the algorithm moreover for autogluon flaml and h2o there is no impact when bertsort is used

i like the ideas proposed by the authors which i believe to be novel in the field of automl but i only recommend weak acceptance at this moment as the empirical results are not extensive ie do not provide ablation study convincing ie do not repeat over different random seeds and
this paper considers preprocessing of categorical features by treating them as text to which a pretrained transformer like bert is used to identify what would be the most likely ordering of these words in a natural corpus i would like to thank the authors for actively addressing reviewers comments or providing clarifications whenever it was needed as a result aru2 and 61b6 and 96ap have raised their score and are inclined to weakly accept the paper however one of the reviewers raised many clarity issues technical errors and typos and suggested a weak reject i would recommend assigning a shepherd to this paper to address the reviewers 6k9h in terms of presentations and writing
[ tokenized input ids for the example above: array of several thousand integers omitted ]
[ attention mask for the example above: array of all 1s omitted ]
[ label token ids for the example above: identical to the input id array, omitted ]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: docsepthis paper linear algebra with transformers studies the application of seq2seq transformers to matrix operations it studies their performance across different encodings of floating point numbers different sizes of matrices different operations and different synthetic data distributions the main findings are that transformers work surprisingly well on various matrix operations addition multiplication eigenvalues inversion svd for small matrices eg 5x5 and that generalization to ood problems is not symmetric ie generalization from one distribution to another does not imply the other way round

strengths
the authors present a detailed study of many important questions this is a fairly comprehensive work on the idea thoughtprovoking application of transformers very well written

weaknesses
i would have loved to see more than just l1 distances the paper studies the question only on random matrices in other symbolic domains we have seen that insights gained from machine learning approaches trained on random data do not necessarily carry over to realworld distributions i would have loved to see a study that includes a wider variety of training and evaluation data the models used in this paper have sometimes rather odd small hyper parameters eg for many experiments the models have only 2 layers id love to see larger models and see if they improve the results and the exact number of parameters

this work explores a wonderful idea to solve linear algebra with transformer models while these models use way more compute internally than the problem they are applied to requires to solve it is an intriguing question whether these computations can be learned from scratch without further biases the surprising answer is that this works relatively well for small matrices

docsepthis paper considers the problem of approximating algebraic computations over matrices using transformers experiments with different encodings are presented investigating the use of transformers for approximating a number of algebraic operations while i found the paper well written i didnt find it very impactful the authors are not proposing a novel technique for addressing the problem but only report some experiments with different encodings for the matrices and a standard transformer architecture i was hoping to get more insights after reading this article such as what architectural choices are beneficial for each specific problem and why instead the proposed approach is training an offtheshelf model with randomly distributed examples and the experiments do not consider alternative techniques

in my opinion the motivation for this approach is also lacking since the reported experiments only consider very small problems that can be solved exactly i wish that the paper considered instead cases that need to be approximated or possibly prove that indeed transformers trained on smaller problems can generalise to much higher dimensions since the learning algorithm has access to an oracle numpy that can provide exact supervision an interesting problem is how to select the most informative input instances to be solved during training another aspect that is worth investigating in my opinion is how to cope with noise in the training data allowing the training of transformers with an approximate oracle

there isnt a novel contribution besides some experiments with an offtheshelf model the motivation is not supported by the
experiments which only report results in lowdimensional settings the experiments do not consider any other technique besides transformers

docsepthe authors train generic dense transformers to perform several standard linear algebra computations ranging from simple tasks like transposition to complex nonlinear tasks such as matrix inversion they restrict themselves to relatively small matrices due to the practical limits of the dense quadratic attention mechanism the main result of the paper is that transformers can perform fairly well on all tasks meaning that they can usually produce outputs that are correct up to relatively small tolerance the paper also shows that some forms of outofdistribution generalization are possible and that this phenomenon is sensitive to the details of the training distribution i found this to be an interesting paper overall

framing
i found the following claim to be problematic: "our results on outofdistribution generalization provide justification to the idea that models trained over random data can be used to solve real world problems" first the authors only evaluate on other synthetic distributions second in most real world problems matrices are gigantic relative to the tiny context windows of dense transformers third since traditional methods are always perfectly accurate on all distributions i think the claim carries with it some burden to elaborate on why such noisy and much less scalable methods might prove useful in practice note that even if the potential practicality cannot be argued for i think the experiments are interesting enough to stand on their own

sparse data reporting
i would have liked to see much more data collected from the experiments especially train-loss, validation-loss, and correctness-up-to-tolerance curves over a range of architectures the curves would also make it clear how many samples had been trained on for each measurement which would be useful for understanding the relative performance of the different encodings note it is not always clear from the current tables which encoding is even being used it would also be interesting to see some analysis / visualization of the attention patterns at least for tasks with relatively simple groundtruth algorithms like transposition and addition some experimental results also seem to be omitted for example section 4.3 claims that deeper decoders are needed for matrixmatrix multiplication but table 5 does not include enough data to defend this claim

out of distribution findings seem unsurprising
it is great that the authors assess outofdistribution and i appreciate the negative result of generalizing from wigner to matrices with positive eigenvalues however although the details of the subsequent outofdistribution experiments are interesting i found it generally unsurprising that models trained on nonwigner matrices would generalize better and i thought that the authors tried to make too big a point of this finding

cotraining
although not essential i would be interested to see how cotraining on many of the tasks at once affects sample efficiency

despite the concerns listed above i think this paper does contribute to our emerging understanding of transformers and that many people in the community will find it worth engaging with

docsepthis paper describes several experiments where transformers are trained to perform realvalued linear algebra calculations: matrix transposition, addition, multiplication, eigenvalues, eigenvectors of symmetric matrices, svd, inversion indistribution accuracy is generally very high whereas
care is needed in order to obtain outofdomain generalization the paper carries out a very thorough set of experiments on linear algebra calculations with transformers using four different encodings of input matrices in addition the authors are aware of the importance of outofdistribution generalization varying both the matrix size and the distribution of input matrices of a given size results appear to be complete and the conclusions drawn from them are generally sound

however the problem tackled in this paper does not appear to be particularly useful in my opinion the conclusions and findings of this paper are only interesting on a theoretical level perhaps they can help understand what transformers can or cannot do rather than being directly applicable in a meaningful way after all we do have algorithms for all linear algebra problems considered and they work with 100% accuracy perfect outofdomain generalization and faster run time as the authors note in the discussion at the current stage transformers have quadratic complexity in the number of tokens which translates into $\mathcal{O}(n^4)$ complexity for $n \times n$ input matrices and this is asymptotically slower than the exact algorithms we have a potentially interesting future direction which perhaps can be advertised more by the author to strengthen the claim that this paper is useful is to investigate lineartime transformers on tasks where the exact algorithm requires more than $\mathcal{O}(n^2)$ time so that perhaps transformers can be used to perform approximate computations with less time

i also have the following minor comments
first line of page 2: $m \times n$ should be in a math formula
section 5 fifth line: what does for small values of n mean here the statement is very precise so it is hard to believe that it is true up to a certain small number say 5 and false for a larger n
end of page 7: this confirms that outofdistribution generalization is possible when particular attention actually you showed that it is necessary to pay particular attention not that it is sufficient so i would write outofdistribution generalization requires particular attention

this paper provides a thorough and wellwritten investigation of the use of transformers to perform linear algebra computation however this does not appear to be particularly useful ### Summary:
the paper demonstrates that transformer architectures can be trained to compute solutions of linear algebra problems with high accuracy this is an interesting direction and as the reviews and the discussion show it is a good data point and insightful as one reviewer puts it i fully agree with this but also agree with one other reviewer in that this is yet another application of a known transformer architecture the author should place the model into context and provide some perspective; without that the motivation behind solving the specific set of linear algebra problems considered is a bit unclear for instance could a transformer now learn to solve corresponding ml problems moreover the dimensions of the considered matrices are rather small and the generalization to larger dimensions appears to be tricky
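the encoding question that runs through the transformer reviews can be made concrete with a small sketch. the scheme below (a sign, mantissa and exponent token per entry, emitted row by row) is only one plausible encoding for illustration, not necessarily any of the four used in the paper.

```python
# one plausible way to serialize a small float matrix into tokens for a
# seq2seq transformer (sign / mantissa / exponent per entry, row-major order).
# this is an assumed scheme for illustration, not necessarily the paper's encoding.
import numpy as np

def encode_matrix(m: np.ndarray, digits: int = 3) -> list:
    tokens = [f"rows_{m.shape[0]}", f"cols_{m.shape[1]}"]
    for x in m.flatten():
        mantissa, exponent = f"{abs(x):.{digits}e}".split("e")
        tokens += ["+" if x >= 0 else "-", mantissa.replace(".", ""), f"E{int(exponent)}"]
    return tokens

m = np.random.randn(2, 2)
print(encode_matrix(m))
# a 5x5 matrix already yields 25 * 3 = 75 value tokens; with dense quadratic
# attention the cost then scales roughly like (n^2)^2 = n^4 for n x n inputs,
# which is the scaling concern raised in the last review.
```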
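the oracle point in the second review (exact supervision is free because numpy can solve every sampled instance) is also easy to sketch. the matrix distribution and the eigenvalue task below are placeholder choices, not the paper's exact setup.

```python
# sketch of the "numpy as oracle" setup the reviews describe: sample random
# matrices and let exact linear algebra provide the training targets.
# the symmetric gaussian distribution and the eigenvalue task are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def sample_pair(n: int = 5):
    a = rng.normal(size=(n, n))
    a = (a + a.T) / np.sqrt(2 * n)        # symmetric, wigner-style scaling
    eigvals = np.linalg.eigvalsh(a)       # exact labels from the oracle
    return a, eigvals[::-1]               # eigenvalues in descending order

x, y = sample_pair()
print(x.shape, y)
# every sampled instance comes with a perfectly accurate label, so the open
# questions are which instances to train on and how models behave out of
# distribution, not label quality.
```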
[ tokenized input ids for the example above: array of several thousand integers omitted ]
[ attention mask for the example above: array of all 1s omitted ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 7152, 33032, 2520, 2929, 4872, 8697, 342, 4979, 398, 2175, 253, 2898, 273, 22510, 19, 14571, 4979, 398, 281, 4315, 5871, 352, 2175, 616, 3045, 2439, 1027, 2349, 351, 723, 273, 14974, 1127, 3904, 1027, 9552, 273, 12624, 1027, 5871, 285, 1027, 13506, 941, 10670, 253, 2022, 4342, 403, 326, 4979, 398, 789, 19143, 973, 327, 2710, 4315, 5871, 1635, 25219, 20223, 27697, 18504, 69, 50276, 1542, 1355, 12624, 24088, 608, 89, 22, 285, 326, 26647, 281, 258, 351, 3237, 310, 417, 13123, 26332, 26647, 432, 581, 3268, 281, 1529, 1057, 417, 16084, 253, 643, 1039, 3790, 20544, 50276, 783, 4477, 1246, 247, 7000, 1263, 273, 1142, 1774, 3533, 436, 310, 247, 9648, 11088, 789, 327, 253, 2934, 50276, 24286, 11404, 6856, 2898, 273, 4979, 398, 50276, 635, 973, 3542, 50276, 20881, 1255, 265, 50276, 74, 651, 452, 7636, 281, 923, 625, 685, 816, 298, 18, 13849, 50276, 783, 2929, 2175, 253, 1953, 760, 327, 3632, 12624, 275, 643, 24762, 10625, 359, 452, 2326, 326, 16039, 12103, 432, 5145, 4715, 7274, 10166, 327, 3632, 941, 513, 417, 7933, 4459, 689, 281, 1524, 10186, 10670, 891, 651, 452, 7636, 281, 923, 247, 1263, 326, 3797, 247, 14200, 5235, 273, 3733, 285, 7103, 941, 50276, 783, 3210, 908, 275, 436, 2929, 452, 4536, 2581, 8909, 1355, 4373, 3602, 24088, 323, 1142, 4679, 253, 3210, 452, 760, 374, 8090, 2654, 2389, 281, 923, 4067, 3210, 285, 923, 604, 597, 3157, 253, 1543, 285, 253, 3242, 1180, 273, 3602, 436, 789, 33826, 247, 9386, 2934, 281, 8415, 4872, 8697, 342, 39707, 3210, 1223, 841, 3210, 897, 1039, 625, 11897, 26506, 685, 253, 1895, 597, 403, 3732, 281, 4419, 281, 8415, 352, 310, 271, 27807, 1953, 1880, 841, 30745, 476, 320, 6311, 432, 20041, 1293, 2007, 31306, 253, 10084, 3662, 310, 326, 436, 2987, 4942, 973, 323, 1355, 12624, 50276, 7152, 33032, 2520, 2929, 1908, 253, 1895, 273, 4020, 839, 20157, 30745, 689, 12624, 970, 4979, 398, 50275, 16217, 3825, 342, 1027, 2349, 351, 723, 403, 3559, 15686, 253, 897, 273, 4979, 398, 323, 4020, 839, 247, 1180, 273, 20157, 5871, 1223, 891, 1119, 253, 2929, 973, 3542, 891, 42126, 1089, 352, 1077, 3486, 1020, 253, 4477, 403, 417, 36636, 247, 4460, 5853, 323, 15974, 253, 1895, 533, 760, 1304, 690, 4679, 342, 1027, 2349, 351, 723, 323, 253, 12624, 285, 247, 2629, 39707, 10336, 891, 369, 11525, 281, 755, 625, 16039, 846, 4361, 436, 3929, 824, 347, 752, 27934, 10165, 403, 12912, 323, 1016, 2173, 1895, 285, 2139, 3185, 253, 4081, 2746, 310, 3733, 271, 273, 649, 1041, 48164, 1566, 342, 12421, 5939, 6667, 285, 253, 4679, 513, 417, 1908, 5795, 5609, 50276, 249, 619, 4743, 253, 16038, 323, 436, 2746, 310, 671, 14999, 1580, 253, 2361, 4679, 760, 1908, 1077, 1355, 3237, 326, 476, 320, 14042, 4555, 891, 5730, 326, 253, 2929, 2783, 3185, 2219, 326, 878, 281, 320, 34930, 390, 6830, 5276, 326, 6296, 4979, 398, 10166, 327, 4577, 3237, 476, 2087, 885, 281, 1199, 2169, 10103, 1580, 253, 4715, 5933, 556, 2289, 281, 271, 42295, 36950, 326, 476, 2085, 3242, 20446, 271, 4722, 1895, 310, 849, 281, 3609, 253, 954, 27096, 3280, 10872, 281, 320, 14042, 1309, 3733, 1529, 4809, 326, 310, 4409, 15686, 275, 619, 4743, 310, 849, 281, 23808, 342, 6046, 275, 253, 3733, 941, 6941, 253, 3733, 273, 4979, 398, 342, 271, 16851, 42295, 50276, 9088, 310, 2649, 247, 4460, 7680, 16280, 690, 4679, 342, 271, 273, 649, 1041, 48164, 1566, 50276, 783, 16038, 310, 417, 4516, 407, 253, 4679, 534, 760, 1304, 1543, 275, 1698, 6967, 7533, 50276, 783, 4679, 513, 417, 1908, 667, 643, 5853, 
16280, 4979, 398, 50276, 7152, 339, 431, 248, 4477, 6194, 12314, 14086, 4979, 398, 281, 1347, 2067, 2629, 4872, 8697, 30745, 12319, 432, 2969, 8892, 751, 811, 3321, 281, 2570, 14561, 8892, 824, 347, 4315, 27697, 597, 4656, 3746, 281, 4942, 1355, 12624, 1955, 281, 253, 8542, 7787, 273, 253, 14086, 21396, 4116, 5122, 253, 2022, 906, 273, 253, 2929, 310, 326, 4979, 398, 476, 1347, 9648, 973, 327, 512, 8892, 4495, 326, 597, 476, 3798, 4711, 18012, 326, 403, 3451, 11776, 80, 4942, 1355, 13761, 253, 2929, 671, 2722, 326, 690, 4948, 273, 562, 1171, 35360, 26647, 403, 1896, 285, 326, 436, 11562, 310, 7996, 281, 253, 4278, 273, 253, 3733, 3268, 891, 1119, 436, 281, 320, 271, 4722, 2929, 4583, 50275, 925, 6472, 50276, 74, 1119, 253, 1563, 1750, 281, 320, 20276, 776, 1543, 327, 562, 1171, 35360, 26647, 2085, 22861, 281, 253, 2934, 326, 3210, 10166, 689, 3632, 941, 476, 320, 908, 281, 8415, 1524, 1533, 3237, 806, 253, 4477, 760, 7472, 327, 643, 13506, 10670, 1273, 275, 954, 1524, 1533, 3237, 12624, 403, 41490, 4103, 281, 253, 10058, 3634, 8323, 273, 14086, 4979, 398, 2626, 1580, 5899, 3082, 403, 1900, 9670, 7899, 327, 512, 10670, 891, 1158, 253, 1750, 15814, 342, 352, 690, 7977, 281, 21184, 327, 2139, 824, 27620, 285, 1199, 1679, 44755, 3082, 1537, 5276, 4217, 275, 3946, 3877, 326, 1014, 604, 253, 2442, 8542, 414, 2550, 320, 9125, 323, 891, 1158, 253, 4679, 403, 4722, 2217, 281, 1462, 327, 616, 1211, 50274, 1033, 10788, 941, 9610, 50276, 74, 651, 452, 10490, 281, 923, 1199, 625, 941, 5728, 432, 253, 4679, 3340, 6194, 18585, 12820, 18585, 285, 36594, 18642, 302, 27730, 9191, 689, 247, 2491, 273, 35615, 253, 9191, 651, 671, 1056, 352, 2590, 849, 1142, 3530, 574, 644, 10166, 327, 323, 1016, 6814, 534, 651, 320, 4217, 323, 4685, 253, 4103, 3045, 273, 253, 1027, 2349, 351, 723, 3877, 352, 310, 417, 1900, 2590, 432, 253, 1655, 7180, 534, 9706, 310, 1014, 1146, 908, 352, 651, 671, 320, 4722, 281, 923, 690, 1783, 34309, 1320, 273, 253, 4116, 6127, 387, 1878, 323, 8892, 342, 4942, 2969, 3216, 33024, 11333, 751, 811, 3321, 285, 1635, 690, 5661, 1543, 671, 1646, 281, 320, 11035, 323, 1650, 256, 3079, 3916, 326, 12861, 1086, 351, 398, 403, 3058, 323, 4315, 6674, 25219, 533, 2829, 608, 1057, 417, 2486, 2217, 941, 281, 2342, 436, 1750, 50275, 483, 273, 3268, 4342, 1646, 5061, 321, 20733, 50276, 262, 310, 1270, 326, 253, 4477, 2939, 562, 1171, 35360, 285, 891, 11435, 253, 4016, 906, 273, 2087, 3006, 432, 259, 35892, 281, 12624, 342, 2762, 20223, 2299, 3738, 253, 4278, 273, 253, 6774, 562, 1171, 35360, 4679, 403, 4722, 891, 1119, 352, 3839, 5061, 321, 20733, 326, 3210, 10166, 327, 1327, 88, 35892, 12624, 651, 39970, 1805, 285, 891, 1869, 326, 253, 4477, 3597, 281, 1056, 1512, 1943, 247, 1127, 273, 436, 4560, 50275, 27678, 26208, 50276, 20261, 417, 5667, 891, 651, 320, 6110, 281, 923, 849, 13450, 26208, 327, 1142, 273, 253, 8892, 387, 2378, 11852, 3410, 6733, 50276, 3229, 3784, 253, 7350, 7117, 1840, 891, 1158, 436, 2929, 1057, 8162, 281, 776, 14149, 4685, 273, 4979, 398, 285, 326, 1142, 952, 275, 253, 3114, 588, 1089, 352, 4409, 15966, 342, 5474, 33032, 2520, 2929, 8631, 2067, 4679, 835, 4979, 398, 403, 10166, 281, 1347, 1524, 24995, 4872, 8697, 10426, 4315, 811, 3321, 1635, 25219, 20223, 50276, 70, 3855, 34383, 273, 13123, 12624, 18504, 69, 27697, 31929, 2382, 7200, 310, 3839, 1077, 1029, 5727, 1557, 310, 3058, 275, 1340, 281, 4044, 562, 1171, 13517, 26647, 253, 2929, 15814, 562, 247, 1077, 11080, 873, 273, 4679, 327, 4872, 8697, 10426, 342, 4979, 398, 970, 1740, 1027, 2349, 351, 723, 273, 3280, 12624, 275, 1635, 
253, 4477, 403, 6600, 273, 253, 6349, 273, 562, 1171, 35360, 26647, 11962, 1097, 253, 4315, 1979, 285, 253, 3268, 273, 3280, 12624, 273, 247, 1677, 1979, 1543, 3176, 281, 320, 3426, 285, 253, 6452, 8392, 432, 731, 403, 3839, 3590, 50276, 35529, 253, 1895, 11463, 1070, 275, 436, 2929, 1057, 417, 3176, 281, 320, 3782, 4217, 275, 619, 4743, 253, 11815, 285, 4342, 273, 436, 2929, 403, 760, 4722, 327, 247, 10527, 1268, 4931, 597, 476, 1361, 2096, 752, 4979, 398, 476, 390, 2550, 513, 2581, 685, 1146, 3587, 7763, 275, 247, 14282, 1039, 846, 512, 359, 513, 452, 11333, 323, 512, 4872, 8697, 3237, 2783, 285, 597, 789, 342, 2233, 7200, 3962, 562, 1171, 13517, 26647, 285, 7938, 1408, 673, 50276, 284, 253, 4477, 3877, 275, 253, 5955, 387, 253, 1655, 3924, 4979, 398, 452, 21396, 10454, 275, 253, 1180, 273, 21761, 534, 30376, 715, 327, 21, 10454, 323, 295, 2069, 295, 3280, 12624, 285, 436, 310, 38311, 17357, 685, 253, 3242, 11333, 359, 452, 247, 7826, 4722, 2852, 3884, 534, 4931, 476, 320, 37636, 625, 407, 253, 2488, 281, 17084, 253, 1750, 326, 436, 2929, 310, 4217, 310, 281, 7409, 1386, 435, 553, 4979, 398, 327, 8892, 835, 253, 3242, 5933, 4419, 625, 685, 327, 19, 673, 594, 326, 4931, 4979, 398, 476, 320, 908, 281, 1347, 16851, 30745, 342, 1679, 673, 50276, 74, 671, 452, 253, 1563, 5884, 5701, 50276, 7053, 1386, 273, 3239, 374, 278, 2069, 295, 943, 320, 275, 247, 14168, 7212, 50276, 4674, 608, 10720, 1386, 752, 1057, 323, 1355, 2193, 273, 295, 1599, 1060, 253, 3908, 310, 1077, 10799, 594, 352, 310, 1892, 281, 2868, 326, 352, 310, 2032, 598, 281, 247, 2176, 1355, 1180, 1333, 608, 285, 3221, 323, 247, 4067, 295, 50276, 423, 273, 3239, 818, 436, 23849, 326, 562, 1171, 35360, 26647, 310, 1896, 672, 1798, 4116, 2686, 368, 2692, 326, 352, 310, 3309, 281, 2075, 1798, 4116, 417, 326, 352, 310, 4209, 594, 891, 651, 3630, 562, 1171, 35360, 26647, 4419, 1798, 4116, 436, 2929, 3400, 247, 11080, 285, 973, 15720, 5839, 273, 253, 897, 273, 4979, 398, 281, 1347, 4872, 8697, 13782, 2299, 436, 1057, 417, 3176, 281, 320, 3782, 4217, 2490, 187, 4118, 18435, 27, 783, 2929, 14371, 326, 39707, 35615, 476, 320, 10166, 281, 11897, 5482, 273, 4872, 8697, 3237, 342, 1029, 7200, 436, 310, 271, 4722, 3884, 285, 347, 253, 253, 10123, 285, 253, 5955, 921, 352, 310, 247, 1175, 941, 1127, 285, 47860, 347, 581, 37317, 12516, 352, 891, 4751, 5194, 342, 436, 533, 671, 5194, 342, 581, 643, 37317, 275, 326, 436, 310, 2568, 1529, 2898, 273, 247, 1929, 39707, 10336, 253, 2488, 943, 1659, 253, 1566, 715, 3634, 285, 2085, 690, 8668, 1293, 253, 16038, 3212, 16161, 253, 2173, 873, 273, 4872, 8697, 3237, 2783, 310, 247, 2372, 12744, 323, 4227, 812, 247, 39707, 1024, 3037, 281, 8415, 3969, 13361, 3237, 25761, 253, 10103, 273, 253, 2783, 12624, 403, 2581, 1355, 285, 253, 26647, 281, 4067, 7877, 3176, 281, 320, 28190 ]
Below is a given review of a research paper from a conference journal. Please write a summary of the review.

### Review:

Summary: This paper proposes a pretraining method, LIME, to improve transformers' performance on mathematical reasoning benchmarks. Specifically, the paper designs three synthetic tasks to teach the transformer model to first learn three primitive reasoning steps: deduction, abduction, and induction. The three datasets are used in the pretraining step, and the paper shows empirical performance gains on large mathematical reasoning tasks in the context of automated theorem proving.

Reasons for the score: This paper provides an interesting direction in the field of automated theorem proving. In particular, it proposes a novel way to teach the model to learn primitive reasoning steps via the design of datasets. The paper provides good motivations and intuitions for the proposed method, but it would be nice for the authors to also formalize these intuitions and provide rigorous definitions, as befits an academic paper (e.g., what is "inductive bias"?); details below. The proposed method comes with some ablation studies in the experiment section, but it is still worthwhile to conduct the following studies to enhance the quality of the paper: (1) how does the positive transfer from the proposed pretraining change as the size of the model increases? (2) It is unclear from the current text how LIME pretraining compares with large-scale unsupervised pretraining (e.g., BERT) in terms of convergence rate and the number of samples needed on the downstream tasks. For example, would pretraining with BERT lead to faster convergence and a smaller number of samples compared to LIME pretraining?

Pros:
1. The idea of designing pretraining datasets to help mathematical reasoning is novel.
2. Good empirical performance gains on large mathematical reasoning tasks in the context of automated theorem proving.

Cons:
1. Though the proposed method comes with some ablation studies in the experiment section, it is still worthwhile to conduct the following studies to enhance the quality of the paper:
   (a) How does the positive transfer from the proposed pretraining change as the size of the model increases?
   (b) It is unclear from the current text how LIME pretraining compares with large-scale unsupervised pretraining (e.g., BERT) in terms of convergence rate and the number of samples needed on the downstream tasks. For example, would pretraining with BERT lead to faster convergence and a smaller sample complexity compared to LIME pretraining?
   (c) Would the proposed method show similar improvements when using different architectures (e.g., LSTM, LSTM with attention, GPT-2)?
2. The writing provides good intuitions in general, but it would be nice for the authors to also formalize these intuitions and provide rigorous definitions, as befits an academic paper.
   (a) The term "inductive bias" may have different connotations across different contexts, and it would be nice for the authors to give a more formal definition of what they mean by inductive bias in the context of this work. For example, inductive bias may refer to the structural bias imposed by the architecture; it may also refer to the prior knowledge of the target tasks, or to the preference of the training scheme (e.g., what kind of functions are learned first by an SGD optimizer). It seems that the usage of "inductive bias" in this work is closest to the prior knowledge of the target tasks, but this is not clear from the current writing.
   (b) Similarly, it would be nice for the authors to clearly state what they mean by "knowledge" in the context of this work. Is it referring to knowledge distillation, learned representations of the context, or something else? For example, in the sentence "we focus on the latter and design pretraining tasks that are intentionally devoid of knowledge and only allow the model to learn inductive bias for reasoning", it is unclear why a specific reasoning procedure cannot be treated as a form of knowledge.
3. The design of the three pretraining tasks is based on the three primitive rules for logical reasoning, which requires nontrivial human insight to formalize; thus the proposed approach may not be easily transferable to other fields.
4. It would be nice for the authors to provide discussions with relevant works on the reasoning abilities of neural networks, such as [1] and [2]. The proposed dataset in [1] could be of interest to this work as a pretraining dataset; [2] provides a theoretical framework to formalize the inductive bias in graph neural networks, which demonstrates the limitations of graph neural networks in terms of their reasoning capacity.

[1] Saxton, David, et al. "Analysing mathematical reasoning abilities of neural models." arXiv preprint arXiv:1904.01557 (2019).
[2] Xu, Keyulu, et al. "What can neural networks reason about?" arXiv preprint arXiv:1905.13211 (2019).
docsep

Summary: The authors propose LIME, a pretraining strategy for learning inductive biases for mathematical reasoning. They construct three synthetic datasets corresponding to three basic reasoning patterns: deduction, induction, and abduction. Each dataset is designed to be devoid of concrete mathematical knowledge but encodes the inductive biases of the reasoning pattern. In experiments, LIME pretraining improves the performance of a generic transformer model on three benchmarks for mathematical reasoning: IsarStep, HOList skip-tree, and MetaMathStep.

Strengths: I really like the idea of designing the synthetic datasets inspired by three basic patterns of mathematical reasoning: deduction, induction, and abduction. Though the basic reasoning patterns were discovered a while ago, it is novel to apply the idea to synthetic data for theorem proving. Traditional automated theorem proving focuses on deduction, but as the authors have explained in the paper, induction and abduction also play significant roles in conjecturing and framing new definitions. The proposed LIME pretraining leads to decent performance improvements on three benchmarks; this is surprising, since the pretraining task is a very simple string-rewriting task without any mathematical knowledge, and it is interesting that it actually works. The paper is well written and very easy to read.

Weaknesses: The authors acknowledge that there are numerous alternative designs for the pretraining datasets and present a few alternatives in the appendix. Are there empirical comparisons between these alternatives? Why did the authors decide to use the current ones? It would also be great to have ablations that include only two of the three pretraining datasets; then we could know whether the three reasoning patterns are truly irreducible. In Sec. 3.2, I am confused about how the authors define "rule" and "case": to me it looks like "a a b c" is more like a case, whereas "a a b b c d e" is more like a rule, since it describes how to map rule symbols to math symbols (a small generation sketch illustrating this rule/case/result structure is given after this review). I am also confused about framing pretraining as learning inductive biases. Technically I believe it is correct: suppose inductive bias is defined as the learning algorithm's preferences when choosing one hypothesis among many equally valid ones; in that case the initialization of model parameters is a kind of inductive bias, and then any pretraining is learning inductive bias. It would be great if the authors clarified more about pretraining and inductive biases in the next revision.

Conclusion: I like the idea of constructing synthetic datasets inspired by basic reasoning patterns, and it is surprising that pretraining on these simple tasks can work. However, I have a few questions, and the paper lacks ablations. I could potentially raise my score if the authors are able to address these questions.

Minor comments (these are minor points that did not account for my score; the authors do not have to address them in the rebuttal):
1. There should be more margin between Table 1 and the caption of Table 2.
2. Use one symbol for "thousand" consistently.
3. "We set the size of the math symbol set 44 and rule symbol set of size 24": this sentence is difficult to parse correctly; I thought there was a typo before reading it a few more times.
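To make the rule/case/result structure discussed above concrete, here is a minimal sketch of how a LIME-style synthetic example could be generated. This is an assumption-laden illustration, not the paper's actual data-generation code: the symbol sets, separator, and string formats are invented for the example.

```python
# Hypothetical sketch of a LIME-style synthetic example (not the paper's exact format).
# Assumes the rule/case/result structure described in the reviews: a "rule" over
# abstract rule symbols, a "case" dictionary mapping rule symbols to math symbols,
# and a "result" obtained by applying the case substitution to the rule.
import random

RULE_SYMBOLS = list("ABC")    # illustrative rule alphabet
MATH_SYMBOLS = list("abcde")  # illustrative math alphabet

def make_triple(rng: random.Random):
    rule = [rng.choice(RULE_SYMBOLS) for _ in range(rng.randint(3, 6))]
    case = {s: [rng.choice(MATH_SYMBOLS) for _ in range(rng.randint(1, 2))]
            for s in set(rule)}
    result = [tok for s in rule for tok in case[s]]  # substitute case into rule
    return rule, case, result

def deduction_example(rule, case, result):
    # Deduction: source = rule plus case, target = result.
    # Induction or abduction would instead hold out the rule or the case.
    src = " ".join(rule) + " <s> " + ", ".join(
        f"{k}: {' '.join(v)}" for k, v in sorted(case.items()))
    return src, " ".join(result)

if __name__ == "__main__":
    rng = random.Random(0)
    print(deduction_example(*make_triple(rng)))
```

Note that, as a later review points out, the same (rule, case, result) triple can often be re-read as a different primitive, which is part of why the reviewers ask for a sharper definition of "rule" and "case".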
docsep

In this work, the authors introduce a method called LIME for imparting certain mathematical inductive biases into a model. The structure of the approach is to first pretrain the model on synthetic tasks that are designed around three principles of mathematical reasoning: deduction, induction, and abduction. Each of these pretraining tasks is a sequence-to-sequence mapping involving three basic components of reasoning (rule, case, and result), where two of these three components are provided as input and the third component is the target output. After pretraining on these tasks, the authors finetune on three different proof datasets and find that the pretraining almost always improves performance, sometimes by a large margin.

Strengths:
1. This approach is creative and thought-provoking. Pretraining is an important topic in ML nowadays, and this paper gives several interesting insights about how to structure pretraining; therefore, publishing this paper at ICLR could help inspire others to use and develop improved variations of pretraining.
2. One aspect of the pretraining that I found particularly impressive was how the authors found such clear improvements from such small amounts of pretraining. This is in stark contrast to the usually massive pretraining datasets that are used, and it stands as an especially strong piece of evidence for the model's usefulness.
3. The experimental setup is well motivated, drawing on a principled analysis of the problem domain.
4. The paper is overall clearly written and clearly structured.
5. There were some interesting discussion points and ablation studies analyzing the approach in more detail. I particularly liked the discussion about how loading the vocabulary weights had little effect, showing that the inductive biases that were imparted were abstract in nature. It was also useful to see that LIME was more useful than other pretraining tasks, ruling out the possibility that you could get similar improvements from just any pretraining task.
Weaknesses:
1. Part of the paper's motivation for imparting inductive bias through a dataset rather than through an architecture is that designing an architecture strongly requires human insight. This is true, but LIME also seems to strongly rely on human insight, so this point is not a benefit for LIME over architectural approaches. This is not a huge problem, but it does not seem like a great motivation for LIME.
2. Related to the previous point, it would be good to discuss the fact that the usefulness of LIME may be limited by the need to design the right pretraining tasks. As Table 4 shows, the nature of the pretraining task is very important, and although the authors were able to create some successful pretraining tasks for mathematical reasoning, it might be harder to create similarly useful tasks for larger-scale tasks in, e.g., language or vision. Again, this is not a huge problem, but I think it at least deserves some discussion.
3. Though the goal of the approach, if I am understanding correctly, is to give inductive biases for induction, deduction, and abduction, the paper gives no direct evidence that it has done so. The authors create an approach intended to impart certain inductive biases, and this approach improves performance on three tasks that plausibly would benefit from those biases, but this result does not necessarily mean that the model has the inductive biases that were intended to be imparted. It is possible that LIME imparted some other inductive biases that are also useful for mathematical reasoning but that are not related to induction, deduction, and abduction. Thus, there is a bit of a gap between the motivation and the actual experiments.
4. It is not entirely clear to me that the specific tasks (deduct, induct, abduct) will necessarily enforce the types of reasoning that they are intended to enforce. For instance, consider the following input/output example: input "a a b b c de s a a b c", output "a a b d e". Such an example is intended to show deduction, but it could instead be viewed as induction, where "a a b c" is the result, "a a b d e" is the rule, and the case dictionary should be read in reverse, treating the values as keys and the keys as values. Thus, related to the previous point, I think there is some concern that the LIME tasks may not necessarily encode the intended primitives. The results show that the LIME tasks clearly encode something useful, but it is not clear exactly what useful things they encode.

Recommended citations (you definitely do not need to include all of these, or even any of these, but I am pointing to them just in case they are useful):
1. You already cite the GPT-3 paper (Brown et al.), but it might make sense to cite it in a second place as well, for the sentence where you say "however, there is another potential advantage of pretraining: it may distill inductive biases into the model that are helpful for training on downstream tasks". Another paper you can cite for this point is "Can neural networks acquire a structural bias from raw linguistic data?" (https://arxiv.org/pdf/2007.06761.pdf).
2. Like your approach, the following paper also uses carefully constructed synthetic datasets as a way to impart targeted inductive biases into a model; however, they use these tasks for meta-training, not pretraining: "Universal linguistic inductive biases via meta-learning" (https://arxiv.org/pdf/2006.16324.pdf). This paper might also be useful as an example of how you can address the last two points I listed under weaknesses, as it gives examples of how to test whether a model has some specific inductive biases. The paper I linked to in the previous bullet (Warstadt and Bowman) also does this. However, adding such analyses might be more work than would be doable for a camera-ready.
3. It might be good to cite Peirce when he is first mentioned in the intro; right now the citation to Peirce is buried deep in the paper, after he has already been discussed at length.
4. Some more potentially relevant examples of architecturally encoding inductive biases for math: https://arxiv.org/pdf/1910.02339.pdf and https://arxiv.org/pdf/1910.06611.pdf.
Other comments (these are not things that have affected my assessment; they are just comments that I think might be helpful in revising):
1. Note that there is another approach in ML called LIME, which could potentially cause confusion. It is completely up to you, but I would consider renaming to avoid confusion. Here is the other LIME, by Ribeiro, Singh, and Guestrin: httpsdlacmorgdoipdf10114529396722939778casatokenvrgsekoqonkaaaaatmzxq2ucwkuvypdd9ytcnk4lsdrfiwsiex4hd8emkjnjevz4drceiim7acirwgtqlqemuqdlajxq
2. Abstract: "neural architecture" should be "neural architectures".
3. Abstract: "on three large very different mathematical reasoning benchmarks" should be "on three very different large mathematical reasoning benchmarks".
4. Abstract: I did not understand what "dominating the computation" meant until I read the rest of the paper. The intro says "it is commonly believed that the benefit of pretraining is that the model can learn world knowledge by memorizing the contents of the natural language corpus"; this statement seems strong. I am more inclined to think that much of the benefit comes from learning linguistic structure, not world knowledge, so it might be safer to reword it as "one plausible explanation for the benefit of pretraining is ...".
5. Page 3 says "the BERT pretraining objective", which suggests that BERT is the objective, but BERT is a model, not an objective; the objectives are masked language modeling and next-sentence prediction.
6. Table 1: the formatting of the table makes it look like the first two rows are numbers copied from Li et al., but from the prose of your paper and from looking at Li et al., I am pretty sure that these numbers are from your own reimplementation. Is that correct? If so, it might be best to format the table differently; using the citation within the body of the table gives a strong suggestion that the numbers come from Li et al., in my opinion.
7. Table 4 and Table 5: in the caption, say what task these results are for, so that the table can be understood on its own.
8. Please double-check the references: several of them seem to only list authors, title, and year when there is at least an arXiv version that could be listed as well (e.g., "Mathematical reasoning via self-supervised skip-tree training", "Enhancing SAT solvers with glue variable predictions", "Transformers generalize to the semantics of logics"). Also, where possible, cite a paper's actual publication venue instead of arXiv (e.g., the Raffel et al. T5 paper appeared in JMLR, not just arXiv).

Summary: Overall, I am rating this an 8 because I find the strengths compelling but think that the weaknesses in framing hold the paper back from an even higher score. I would consider increasing the score if those weaknesses were addressed, though those weaknesses are deep enough that it would be hard to properly address them in time.
docsep

With the aim of learning inductive biases for deep neural net architectures, this paper presents three synthetic experiments for learning primitive forms of mathematical reasoning in theorem provers. The overall idea is inspired by Peirce's view that these primitives are deduction, abduction, and induction. The synthetic tasks are built upon a simple arithmetic language with a source and a target. Using those tasks for pretraining a transformer model, the authors show the merit of this methodology in several mathematical reasoning experiments.

I am not an expert in transformer models and pretraining techniques, so it is possible that I did not understand some parts in Sections 4 and 5. Overall, I think that the idea of training a learner with primitive forms of inference is interesting for improving its performance in mathematical reasoning, and the experimental results corroborate the relevance of this approach.

My main comment lies in the specification of the synthetic tasks (Section 3.2). Here the authors are using an ad hoc arithmetic language upon which deduction, induction, and abduction tasks are defined, but this ad hoc language has no formal semantics, so we cannot formally capture the primitive forms of reasoning. Contrastingly, in Section 3.1 the three reasoning primitives identified by Peirce are well defined because they are captured by the semantics of first-order logic (the general schemas are written out after this review). For example, take the abductive mode of reasoning:
Rule: $\forall x\, (\mathrm{bag}(x) \rightarrow \mathrm{white}(x))$ ("all the beans from this bag are white").
Result: $\mathrm{white}(o)$ ("these beans are white"), where $o$ is an object in the Herbrand universe.
Case: $\mathrm{bag}(o)$ ("these beans are from this bag").
From the rule and the result we can indeed infer by abduction that the case is an explanation of the result, since (i) $\mathrm{bag}(o) \wedge \forall x\, (\mathrm{bag}(x) \rightarrow \mathrm{white}(x)) \not\models \bot$ (consistency) and (ii) $\mathrm{bag}(o) \wedge \forall x\, (\mathrm{bag}(x) \rightarrow \mathrm{white}(x)) \models \mathrm{white}(o)$ (consequence). But in Section 3.2 the arithmetic language is given without any semantics, and hence we cannot define a clear, unambiguous form of logical consequence ($\models$); therefore the notion of abduction is here very unclear.

To sum up, the synthetic tasks proposed by the authors might indeed help in learning an inductive bias capable of improving theorem provers, but there is a discrepancy between the logical notions of deduction, abduction, and induction defined by Peirce (and more generally in the knowledge representation literature) and the reasoning primitives, essentially some forms of pattern matching, presented in the synthetic tasks.
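For reference, the three reasoning primitives this review appeals to can be written out schematically. This is the standard textbook formulation with our own notation, not the paper's formalization:

```latex
% Peirce-style reasoning primitives, stated schematically over a rule, a case,
% and a result (standard formulation; notation is ours, not the paper's).
\begin{align*}
\text{Deduction:} &\quad \mathrm{Rule} \wedge \mathrm{Case} \models \mathrm{Result}\\
\text{Abduction:} &\quad \text{find } \mathrm{Case} \text{ s.t. }
  \mathrm{Rule} \wedge \mathrm{Case} \not\models \bot
  \ \text{ and } \ \mathrm{Rule} \wedge \mathrm{Case} \models \mathrm{Result}\\
\text{Induction:} &\quad \text{find } \mathrm{Rule} \text{ s.t. }
  \mathrm{Rule} \wedge \mathrm{Case} \models \mathrm{Result}
\end{align*}
```

All three schemas rely on a notion of logical consequence ($\models$), which is the reviewer's point: without a semantics for the arithmetic language, the synthetic "abduction" and "induction" tasks are pattern-matching analogues rather than instances of these schemas.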
### Summary:
The authors propose a pretraining strategy for learning inductive biases in transformers for deduction, induction, and abduction. Further, the claims and results seem to indicate that such pretraining is more successful in transformers, which provide a more malleable architecture for learning inductive structural biases. There are open questions that remain, specifically surrounding disentangling high performance from structural bias learning (i.e., is pretraining doing what we think it is?) and whether datasets are the correct mechanism for imparting such biases/knowledge.
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary authors propose an interesting approach to do shapetailored spatial filtering and encode equivariance constraints by using solutions of specifically constructed poisson pdes this leads to a significant decrease in number of trainable parameters and amount of training data required pros the paper seems to largely wellwritten and includes appropriate references encouraging results are demonstrated although on a few experiments cons some parts about the smoothing process l6373 and its incorporation into the model are hard to follow for someone with less exposure to the references needs more clarity about the experiments and the metrics reported the reviewer is not familiar with the some of the literature behind this work but the paper and idea seem to be a good fit for the workshopdocsepthe paper is well written and their approach is grounded in solid mathematical theory it is fairly easy to follow it presents an interesting approach to deal with regions that are not trivial compositions of rectangles although as they point out this is not per se an innovation they add a further layer no pun intended which follows from their mathematical framework their approach is robust against transformations such as rotations translations and deformations of the domain they test to which extent these properties are actually inherited by the architecture and achieve stateoftheart performances on the other hand it is not clear the effort put into finetuning their competitors we all know you can hammer a model to obtain better results but nothing takes away the fact that the number of trainable parameters is orders of magnitude below that of sota dnns to conclude 1 interesting approach and results backed by theoretical arguments 2 experiments could be improved more transparency could be provided as well as a clear presentation of results and metrics used issues 1 what is gtcovering 2 consider moving figure 1 in the main article it is one of your main results do not hide it in the appendix 3 consider renaming the region r with a different symbol there are plenty of other letters that do not remind directly to the line of real numbers or to the rotation operator in the proofs 4 if you give proofs you have to decide you give a sketch or you are fully formal and define everything no halfway therefore consider enhancing proofs for example what is h in line 257 i guess its the hessian matrix but maybe another reader might not do the same 5 line 12 either you use poisson partial differential equation or poisson partial differential equation not poisson partial differential equation 6 line 34 insert a space after the dot in consideration3 we 7 make sure you use parenthesis whose height can adaptdocsep summary the article proposes to take advantage of a pde description of a segmentation problem to drastically reduce the amount of parameters x103 to learn still improving the final performance comments pros the methodology is new drastic reduce in amount of parameters with apparent good results for similarity detection but not clear results on actual segmentation cons l165 the sentence clarity can be improved table 2 the metric reported gtcovering is not defined only presented it would be important to see the actual performance on the dataset since gt seems to capture the similarity of the representations maybe the network is making great representations for similarity but not for segmentation the 
paper robustness would benefit from reporting means and std of several runs with different initialization seed ### Summary:
the reviews were somewhat mixed with reviewers raising some issues with the clarity of the submission and concerns about the need for more and better experimental tests nonetheless all reviewers agreed that this is an interesting approach with great promise to deliver a robust theorybacked impact of practical relevance
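As an editorial illustration of the region-constrained ("shape-tailored") smoothing the review above refers to — this is a minimal sketch only: the screened-Poisson operator, the Jacobi solver, and all parameter names are assumptions of this example, not the reviewed paper's actual construction.

```python
import numpy as np

def poisson_smooth(feature, mask, alpha=10.0, n_iter=500):
    """Solve (I - alpha * Laplacian) u = feature restricted to `mask` using
    damped Jacobi iterations; neighbours outside the region are ignored,
    so the smoothing adapts to the region's shape."""
    u = np.where(mask, feature, 0.0)
    m = mask.astype(float)
    for _ in range(n_iter):
        # sum and count of the 4-neighbours that fall inside the region
        nb = (np.roll(u * m, 1, 0) + np.roll(u * m, -1, 0) +
              np.roll(u * m, 1, 1) + np.roll(u * m, -1, 1))
        cnt = (np.roll(m, 1, 0) + np.roll(m, -1, 0) +
               np.roll(m, 1, 1) + np.roll(m, -1, 1))
        # Jacobi update of (1 + alpha * cnt) * u = feature + alpha * nb
        u_new = (feature + alpha * nb) / (1.0 + alpha * cnt)
        u = np.where(mask, u_new, 0.0)
    return u

# toy usage: smooth a noisy feature map only inside a disk-shaped region
yy, xx = np.mgrid[0:64, 0:64]
mask = (yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2
smoothed = poisson_smooth(np.random.randn(64, 64), mask)
```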
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary this paper proposes an information bottleneck method dropbottleneck that allows the input to be compressed by dropping each input feature with probability pi the model then learns the drop probability vector p p1 pn where dropping redundant features will reduce the compression penalty term ixz the approach is demonstrated in experiments in 1 robust exploration setting for rl 2 adversarial attacks on imagenet and 3 an experiment showing that their approach is able to maintain performance on imagenet with reduced dimensionality evaluation i found this paper to be clear and the experiments seem reasonable i dont know of any prior work that takes this approach im not an rl expert so i wont comment on the strength of the rl results other than that their methods were clear and they seem to have been careful and fair in choosing baselines my biggest criticism is that the robustness experiments on imagenet compare to vib alemi et al 2017 as a baseline but they should really compare to the more recent adversarial results on conditional entropy bottleneck another information bottleneck approach that outperforms vib given in fischer and alemi 2020 httpsarxivorgpdf200205380pdf id like to see that added to the revised versiondocsepsummary this paper proposes the dropbottleneck db method that performs feature selection during the training with the mutual information it achieves the stateoftheart result in few reinforcement learning tasks and trains a more robust model originality and significance aspect this paper combines mainly two ideas 1 classic feature selection choose xis to drop with respect to the mutual information between x and y 2 information bottleneck ib formulation that maximizes the predictionterm mutual information term between z and y and minimizes the compression information between x and z simultaneously it is actually fairly close to the core idea of the feature selection because it finds the compression by dropping the original feature or not unlike the other ib compression methods in other words db is a modernized version of the feature selection that automatically drops a feature based on the ibstyle loss i would say the idea is not entirely new somewhat limited but it could be still useful to the community in the significance aspect i wanted to see authors to apply this method on other noisy setup tasks eg computer vision tasks with noise outside of the reinforcement learning tasks the improvement on the rl tasks seems to be substantial however i would like to know how db performs when there is correlation between feature dimensions see below clarification question to see more details quality and clarity aspect the paper was overall easy to follow here are a few questions to authors what if we just drop the feature space only using the mutual information between x and y and drop them to achieve a similar number of features that was resulted by db it is basically the classic mutual information feature selection would that perform as good as db can you make a comparison i think this should be one of the baseline if thats similar to db what would be the benefit of db does db always minimize hatiz x with the independent assumption when some of the xis are correlated eg consider a vision task there could be quite a gap between iz x and the independent assumption version recommendation i think the paper is at the borderline looking forward to seeing more discussion 
with the classic feature selection method and some evaluation on tasks outside of rl if possible i would be happy to revisit my score post rebuttal comment i thank the authors for the rebuttal authors have addressed my concerns and clarified some of the confusing points that i had i would like to recommend this paper to be accepted docsepthe paper proposes a new information bottleneck objective which compresses the latent by learning to drop features similar to dropout unlike dropout a different probability is learnt for each latent featuredimension using concrete relaxation the experiments show that the works well overall i score this paper as an accept while the approach is limited to dropping input features which does not make it a general ib objective it seems to work very well in the presented rl experiments as well as show robustness that is better than dvibs moreover the paper is clearly written and engaging strengths the paper proposes a simple yet effective idea using a concrete relaxation to learn dropout probabilities has been done before but the idea to have a separate probability per latent dimension is novel the dropbottleneck objective works directly on the inputlatent layer which means that the compression objective is easy to compute this is nice because ib terms are usually cumbersome to compute however this also requires the inputlatent to already be disentangled as dropping out features is limited in its expressiveness the rl experiments on vizdoom and dmlab are convincing as are the ones on imagenet the additional experiment on the occluded cifar10 dataset in the appendix is also well thoughtout and shows the advantage of this straightforward method over dvib moreover once trained features can be dropped out deterministically if so required which allows for proper compression and consistent behaviour questions given the simple conceptual idea this reviewer would be interested to see an ablation with using other methods of enforcing sparsity in the latent could l1 regularization of the latent activations be used instead of the iz x term db cannot provide the same generality as other ib objectives the input latent has to be sufficiently disentangled already as the objective itself does not encourage further disentanglement by itself do the imagenet experiments use extractedpretrained embeddings as latents rebuttal i thank the authors for their reply im more confident this is a good paper nowdocsepsummary the paper contributes a novel method dropbottleneck db for discretely dropping input features that are irrelevant for predicting the target variable key idea is to instantiate the compression term of the information bottleneck framework with learned term that sets irrelevant feature dimensions to 0 to this end a drop probability is learned for each dimension dimensions that have a lower probability than 05 a fixed threshold of being relevant are set to 0 strong points the paper is wellwritten and easy to understand experiments show that db works better than vib in vizdoom and dmlab when a noisytv noise is added to the input images different noisytv noises are considered changing the image when the agent performs an action adding random noise to the tv and adding random noise when the agent performs an action experiments also show that the obtained representation is more robust against l2 and linf attacks in imagenet furthermore the experiments show that the approach can drop many imagenet features while almost preserving the accuracy of a resnet the paper comes with code 
in the supplementary material weak points i found the reinforcement learning experiments not convincing since only a fixed region of the input is modified by noise ie the noisy tv hence the approach essentially identifies irrelevant pixel locations such a problem could be solved by a simple preprocessing step the method wont work if the location of the noise changes in general limitations of the work are not discussed the experiments on imagenet are more interesting however the fact that individual dimensions ie specific pixel locations are identified as irrelevant is still a limiting factor furthermore the experiments do not fit to the focus of this paper on reinforcement learning the paper does not discuss connections of the presented approach to prior works for discrete feature selection it only discusses connections to prior bottleneck methods the paper does not perform experiments on datasets with meaningful features where a feature selection makes more sense than for specific pixels in images additional feedback it could be stated explicitly that h refers to the entropy currently it is only implicitly defined in eq 6 i think it would be better to also cite jang et al categorical reparameterization with gumbelsoftmax for the concrete relaxation of the bernoulli distribution figure 1 should either be improved or removed i dont see much additional insights that can be gained from this figure of course it would be very interesting to see if the drop probabilities correlate with the location of the noise inputs it would be great if such an analysis could be added this could replace figure 1 ### Summary:
this paper proposes to enhance the robustness of rl and supervised learning algorithms to noise in the observations by dropping input features that are irrelevant for the task it relies on the information bottleneck framework well derived in the paper and learns a parametric compression of the input features that sets them to zero if they are not relevant for the task the method is extensively evaluated on several rl tasks exploration in vizdoom and dmlab with a noisy tv distractor and supervised tasks imagenet or cifar10 classification with noise reviewers have praised the idea derivation and writing as well as the extensive experiments on rl and supervised tasks critique focused on the contrived nature of the tv noise localised always in the same corner of the image a standard evaluation according to the authors lack of comparison with other feature selection methods lack of comparison with conditional entropy bottleneck done during rebuttal more general noise than just specific pixels clarified by the authors as being the features coming out of a convnet given that the reviewers comments were largely addressed by the authors and given the final scores of the paper i will recommend acceptance
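To make the drop-based compression mechanism discussed in these reviews more concrete, here is a minimal sketch of per-dimension learned keep/drop gates with a binary-concrete (relaxed Bernoulli) relaxation. The module name, the hard 0.5 threshold at test time, and the entropy-weighted penalty are illustrative assumptions of this example, not the paper's exact objective.

```python
import torch
import torch.nn as nn

class LearnedFeatureDrop(nn.Module):
    """Per-dimension learned keep/drop gates via a binary-concrete relaxation."""
    def __init__(self, dim, temperature=0.1):
        super().__init__()
        self.keep_logit = nn.Parameter(torch.zeros(dim))  # learned keep logits
        self.temperature = temperature

    def forward(self, x):
        if self.training:
            u = torch.rand_like(x).clamp_(1e-6, 1 - 1e-6)
            noise = torch.log(u) - torch.log1p(-u)            # logistic noise
            gate = torch.sigmoid((self.keep_logit + noise) / self.temperature)
        else:
            # gates can be hardened at test time for a consistent, compressed model
            gate = (torch.sigmoid(self.keep_logit) > 0.5).float()
        return x * gate

    def compression_penalty(self, per_dim_entropy):
        # crude surrogate for I(X; Z) under an independence assumption across
        # dimensions: each kept dimension pays its estimated marginal entropy
        return (torch.sigmoid(self.keep_logit) * per_dim_entropy).sum()
```

Because the latent keeps the same coordinates as the input, the compression term stays cheap to estimate, but the encoder is limited to feature selection rather than arbitrary re-encoding — the trade-off the reviewers point out.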
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper builds upon previous lines of research on multitask learning problem such as conditional latent variable models including the neural process as shown by the extensive related work section this seems to be an active research direction this makes it difficult for me to judge originality and significance but it is wellwritten and clear specific comments the approximate posterior distribution qphi is often referred to as the posterior distribution i would keep approximate here p2 correspondence of gps with infinite bayesian nns bnns what is meant by infinite bnn is it infinitewidth bnn please specify p2 adaptive blr please describe the acronym p6 the gaussian approximation of the posterior predictive likelihood 10 is said to be inspired by gps which also define a gaussian likelihood this is also essentially what is done by synthetic likelihood a nature paper by sn wood 2010 which is i think more closely related to the proposed approach than gps p6 one line below define pb acronym the first time it is used not three lines after typos top p2 does not introducing noticeable computational overhead p6 conditional model with an decoder operating on the references list could be tidied some first names are abbreviated some notdocsepin this paper the authors make two contributions to neural processlike clv models first they replace the somewhat adhoc variationallike approach to learning the amortized latent variable distribution with a monte carlo based approximation second they replace the step of context aggregation with direct latent variable inference over z overall in my opinion these modifications make the neural process model significantly cleaner from a bayesian perspective and is quite nice in the context of neural processes i have very little criticism for the authors methods everything makes sense and the mc approach in equation 3 seems cleaner to me than the somewhat ad hoc vi like approach in equation 2 the biggest difficulty i have is determining how to evaluate the authors clear improvements to neural processes in the broader context of scalable probabilistic regression an area to which the authors claim membership to start with the authors clearly demonstrate the value of both bayesian context aggregation and a mc based likelihood approximation scheme on precisely the same types of problems that existing neural processes papers eg garnelo et al 2018 have considered with the notable exception that the 2d image completion task considers only mnist as a target dataset in this respect its difficult to fault the experimental evaluation however this paper and many neural process papers are written in the context of formulating scalable probabilistic regression models with reliable uncertainty estimates surely at some point this should involve a comparison of these approaches to existing techniques for probabilistic regression whether that be deep gaussian processes dropout based approaches bayesian neural networks or other approaches i dont mean to imply here that the authors are unaware of this large body of literature indeed the authors have a decent if incomplete overview of techniques in this area notably missing work on deep gps rather it just seems surprising to me that the discussion of the relevant probabilistic regression literature ends at well it exists what i would like to see is a discussion of where the authors impressive improvements to neural processes 
leave the model family in this broader context how close or far off is the family on performance for standard benchmark regression tasks are there settings in which we can leverage the fully nn based nature of neural processes to achieve probabilistic regression in settings where the inductive biases of kernel methods are poor like in computer vision or natural language processing the relatively toy nature and limited dimensionality of the problems considered suggests that there is still significant progress to be made before such a comparison would be reasonable or even possible to summarize in the context of neural processes i feel the paper makes good methodological contributions in presenting a much cleaner and more natural from a bayesian perspective version of the model that has more of the flavor of standard amortized inference for latent variable models within the very narrow context of neural process papers i therefore have very little to complain about however from a broader scientific perspective i would feel that the paper would be significantly strengthened by a fair evaluation to the rest of this literature whether empirical or simply in discussion regardless of how the authors approach fares in comparisondocsepsummary of the paper this paper describes bayesian context aggregation for neural processes these models are useful to address regression problems in which a set of related tasks are available for inference with associated context information in the form of extra data these models assume that there is a taskspecific global latent variable and taskindependent latent variable they are learned via approximate maximum posterior likelihood in which the latent variables specific for each tasks are marginalzied out for this an approximation to the posterior distribution of these variables is need this requires conditioning to the context dataset which is challenging in the past a latent representation is used and the context data set is aggregated as the mean of the latent representation in this paper a bayesian way of aggregating context information is proposed this is based on using bayes rule and a gaussian generative model for the latent representations the proposed method also leads to a new way of training clv models which is based on moment matching the method is validated on several synthetic an realworld experiments showing improvements over mean aggregation detailed comments i believe that this is a relevant paper context aggregation is a difficult problem that is required to address the learning tasks described in the paper previous solution look limited and the proposed method seems natural and a more effective method of aggregating this information the paper is well written and the proposed method is sound the experiments are also convincing and exhaustive i believe that this is a relevant paper for the conference docsepthe authors present the bayesian aggregation ba mechanism in the context of neural processes nps for aggregating the context information into the latent variable z in the form of posterior updates to z the authors show that this improves predictive performance in terms of likelihood compared to mean aggregation ma that it replaces on various regression tasks with varying inputoutput dimensionality strengths 1 the idea is simple and leads to a notable improvement compared to ma in terms of likelihood 2 the background and method is presented very clearly 3 the evaluation is done on a wide variety of tasks ranging from standard 1d regression of gp 
samples to pendulum trajectory prediction tasks weaknesses 1 the evaluation is missing an important baseline namely anp models that have selfattention in the encoder for processing the contexts cf model figure in the anp paper kim et al 2019b contrary to the np/cnp baselines that are compared against in the paper the anp with selfattention in the encoder does not give uniform weights to each context point the selfattention allows the model to assign varying importance to the different context points despite using mean aggregation after the selfattention which is presented as a key motivation for the ba mechanism introduced in the paper hence for the experiments i strongly suggest comparing against cnp/np/anp with selfattention in the deterministic/latent/latent path of the encoder for completeness it would be nice to also compare against models that have both deterministic and latent paths since ba can also be applied to these models at the same time i understand that ba would be more interpretable for showing which observations have little/high effect on z compared to the approach of using selfattention in the encoder but it would still be very informative for the reader to be able to compare the two approaches also these two approaches can be combined to have selfattention in the encoder plus ba which might also yield improved performance 2 the claim that ba includes ma as a special case doesnt seem to be true using a noninformative prior and uniform observation variances leads to constant sigma_z and to mu_z being linearly proportional to mean(r_n) ie (sum_n r_n) / n which is not quite the same as ma ma allows sigma_z and mu_z to be nonlinear functions of mean(r_n) hence is strictly more expressive than this special case 3 in equation 7 it seems as though the context points (x_n, y_n) only affect r_n via the variance which seems unnecessarily limiting why not have the mean also depend on r_n eg p(r_n | z) = N(r_n; z + mu_{r_n}, diag(sigma_{r_n}^2)) where mu_{r_n} is also computed as a function of (x_n, y_n) this will still give a closedform posterior p(z | r_{1:n}) since the mean of p(r_n | z) is still linear in z creating a model thats strictly more expressive with very similar efficiency it would be informative to see how this changes the experimental results 4 im guessing the vi objective was used to train the anp given the clear advantage of training with the mc objective shouldnt the anp also be trained with mc 5 the latent variable models were not evaluated on 2d image completion tasks because architectures without deterministic paths were not able to solve this task why not then add a deterministic path to these latent variable models to allow them to train other points in the text it says that the model is also compared against anp to show that ba can compete with sota this is arguably incorrect since convcnp models are sota among models of the np family showing a significant improvement over anp hence to achieve the goal mentioned in the text it would make sense to compare with convcnp models as part of the evaluation against other deterministic nps overall the paper is presented very clearly with a simple yet effective idea tested on a wide variety of tasks however its missing an important baseline that uses selfattention in the encoder along with several other baselines that would be informative to compare against i am willing to increase my score should these results be included in the revised version of the paper score raised to 6 after inclusion of ma + sa results in rebuttal ### Summary:
the authors present a bayesian approach for context aggregation in neural process based models the article is well written and provides a nice and comprehensive framework the reviewers raised some issues regarding the lack of comparisons to proper baselines the authors provided additional comparisons in the revised version the comparisons were found satisfactory by some reviewers who increased their scores based on the revised version i recommend acceptance
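As an illustration of the gaussian aggregation rule debated in the review above (weaknesses 2 and 3), the sketch below shows the closed-form factorized posterior update over the latent z under the observation model p(r_n | z) = N(r_n; z, diag(sigma_{r_n}^2)) with a gaussian prior. This is a minimal sketch under that assumed observation model; the function and variable names are illustrative and are not taken from the paper under review.

```python
import numpy as np

def bayesian_aggregation(r, sigma_r, mu0=0.0, var0=1.0):
    # r, sigma_r: arrays of shape (n_context, d_z) holding the per-context
    # latent observations r_n and their (learned) standard deviations.
    # Returns mean and variance of the factorized Gaussian posterior over z
    # under p(r_n | z) = N(r_n; z, sigma_r_n^2) and prior p(z) = N(mu0, var0).
    prior_prec = 1.0 / var0
    obs_prec = np.sum(1.0 / sigma_r**2, axis=0)   # summed observation precisions
    var_z = 1.0 / (prior_prec + obs_prec)         # posterior variance
    mu_z = var_z * (mu0 * prior_prec + np.sum(r / sigma_r**2, axis=0))
    return mu_z, var_z

# toy usage with three context observations of a 2-d latent
rng = np.random.default_rng(0)
r = rng.normal(size=(3, 2))
sigma_r = np.full((3, 2), 0.5)
print(bayesian_aggregation(r, sigma_r))
```

With a flat prior and equal observation variances this update collapses to the plain mean of the r_n, which is the degenerate special case the reviewer contrasts with learned mean aggregation.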
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the draft proposes a heuristic combining environment rewards with an irlstyle reward recovered from expert demonstrations seeking to extend the gail approach to irl to the case of mismatching action spaces between the expert and the learner the interesting contribution is in my opinion the selfexploration parameter that reduces the reliance of learning on demonstrations once they have been learned sufficiently well questions in general its known that behavioral cloning of which this work seems to be an example in so much as it learns state distributions that are indistinguishable from the expert ones can fail spectacularly because of the distribution shift kaariainen alw06 ross and bagnell aistats10 ross and bagnell aistats11 can you comment if ganbased methods are immune or susceptible to this would this work for tasks where the statespace has to be learned together with the policy eg image captioning tasks or atari games is it possible to quantify the ease of learning or the frequency of use of the new actions ie a_l setminus a_e wont learning these actions effectively be as difficult as rl with sparse rewards say in a grid world where 4way diagonal moves allow reaching the goal faster the learner is a king (8way) demonstrations come from a 4way expert rewards are sparse and each step receives a -1 reward and the final goal is large positive does the learners final policy actually use the diagonals and when related work is it possible to make a connection to data or policy aggregation methods in il such methods eg chang et al icml15 can also sometimes learn policies better than the expert experiments why gail wasnt evaluated in fig 3 and fig 4 minor whats bce in algorithm 1 fig1 the the sec 32 but avoid but avoids sec 32 be to considers be to consider sec 32 any hyperparameter any hyperparameters colors in fig 2 are indistinguishable table 1 headers saying which method is prior work and which is contribution would be helpful fig 3 if possible try to find a way of communicating the relation of action spaces between expert and learner eg a subset of/superset of using the same figure to depict selfexploration makes it complicated to analyse sec 32 wording in the last paragraph on p4 positive scaling wont make anything positive if it wasnt before

the paper proposes to combine expert demonstrations together with reinforcement learning to speed up learning of control policies to do so the authors modify the gail algorithm and create a composite reward function as a linear combination of the extrinsic reward and the imitation reward they test their approach on several toy problems small grid worlds the idea of combining gail reward and extrinsic reward is not really new and quite straightforward so i wouldnt consider this as a contribution also using state only demonstrations in the framework of gail is not new as the authors also acknowledge in the paper finally i dont think the experiments are convincing since the chosen problems are rather simple but my main concern is that the major claim of the authors is that they dont use expert actions as input to their algorithm but only sequences of states yet they test their algorithm on deterministic environments in such a case two consecutive states kind of encode the action and all the information is there even if the action sets are different in some of the experiments they are still very close to each other and the encoding of the expert actions in the state
sequence is probably helping a lot so i would like to see how this method works in stochastic environments

this paper proposes some new angles to the problem of imitation learning from state only observations not stateaction pairs which are more expensive specifically the paper proposes self exploration in which it mixes the imitation reward with environment reward from the mdp itself in a gradual manner guided by the rate of learning it also proposes a couple of variants of imitation rewards rtgd and atd in particular which formulate the imitation rewards for random or exhaustive pairs of states in the observation data as opposed to the rewards proposed in existing works csd ssd which are based on either consecutive or single states which constitute the baseline methods for comparison the authors then perform a systematic experiment using a particular navigation problem on a grid world and inspect under what scenarios eg when the action spaces of the expert and learner are the same disjoint or in a containment relationship which of the methods perform well relative to the baselines some moderately interesting observations are reported which largely confirm ones intuition about when these methods may perform relatively well there is not very much theoretical support for the proposed methods per se the paper is mostly an empirical study on these competing reward schemes for imitation learning the empirical evaluation is done in a single domain/problem and in that sense it is questionable how far the observed trends on the relative performance of the competing methods generalize to other problems and domains also the proposed ideas are all reasonable but relatively simple and unsurprising casting some doubt as to the extent to which the paper contributes to the state of understanding of this area of research ### Summary:
this paper proposes to combine rewards obtained through irl with rewards coming from the environment and evaluates the algorithm on grid world environments the problem setting is important and of interest to the iclr community while the revised paper addresses the concerns about the lack of a stochastic environment the reviewers still have major concerns regarding the novelty and significance of the algorithmic contribution as well as the limited complexity of the experimental domains as such the paper does not meet the bar for publication at iclr
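The reviews and decision above revolve around blending a GAIL-style imitation reward with the extrinsic environment reward under a self-exploration parameter. The snippet below sketches that linear blend together with one possible progress-based annealing schedule; the schedule and all names are assumptions made for illustration, since the reviews do not spell out the paper's exact gating rule.

```python
import numpy as np

def mixed_reward(env_reward, imitation_reward, alpha):
    # Linear blend of the extrinsic (environment) reward and the GAIL-style
    # imitation reward; alpha in [0, 1] is the self-exploration weight.
    return alpha * env_reward + (1.0 - alpha) * imitation_reward

def update_alpha(alpha, recent_returns, threshold, step=0.05):
    # Placeholder annealing rule: once the recent average return clears a
    # threshold, shift weight away from the demonstrations and toward the
    # environment reward. The paper's actual schedule may differ.
    if np.mean(recent_returns) >= threshold:
        alpha = min(1.0, alpha + step)
    return alpha

# toy usage
alpha = 0.0
for episode_return in [1.0, 2.0, 5.0, 6.0]:
    alpha = update_alpha(alpha, recent_returns=[episode_return], threshold=4.0)
print(alpha)  # roughly 0.1 after the two episodes above the threshold
```

Tying alpha to learning progress rather than a fixed clock is what lets the agent lean on demonstrations early and fall back on its own exploration once the imitation signal has been absorbed, which is the property the first review singles out as the interesting contribution.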
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper combines insights from modular metalearning lifelong learning and rl to arrive at a system that can learn online to train modules that can be recombined to solve future tasks without further training in addition training that happens offline can retrospectively improve modules that were constructed to solve earlier tasks these methods are demonstrated in a pedagogical domain and on some highdimensional robotcontrol problems this paper describes a method that is sensible and seems to work well it is a creative combination of existing ideas my most favorite and also in a way least favorite feature of this paper is the strategy of instantiating the modular structures for all the tasks with appropriate sharing and then performing endtoend experience replay on the whole ensemble simultaneously its a very nice way to get each of the modules to understand its shared job in the overall task ensemble but it also seems unsatisfying that we have to remember and reuse all the data can you explain a bit more about what the various colors mean in figure 2 left i guess youre making a plot for 3 or so different numbers of tasks eg 20 40 60 also the shaded error regions seem surprisingly small is there really so little variance what sources of error does a seed control the tasks it is trained on initial weights is there any randomization in the domain legends for sparse dotted lines are really hard to view it would probably be better just to use solid lines of different colors the phenomenon of backward transfer is nice but why is it surprising it seems almost inevitable given your training setup minor are the only that achieve therefore upon training a new task we downscaled the actions and qvalues this seems like a very blunt instrument for addressing the problem with ppo isnt there something that would let you retain the actual module weight values and therefore some hope of zeroshot module recombination it was still able to learn slightly faster than stl despite using an order of magnitude fewer trainable parameters having fewer trainable parameters if they are organized right should make learning go faster again the stderr seems so small in figs 3 and 4 please be sure to articulate all the possible sources of variance and explain how many times youre varying each one and be careful about the statistical analysis for example training a network once but testing it 100 times and then reporting a standard error based on n = 100 is somewhere between not very illuminating and actually misleading i dont mean to say thats what youre doing its just an example of a common error and the kind of thing to watch out for the reward function for the robotics problems is very wellshaped so well it seems that one could possibly just write a controller that does gradient ascent on r to reach the objective i understand that this is the norm for rl on robotics tasks but it is very nearly supervised learning it could be interesting as a point of comparison to just generate a big dataset of good trajectories for each task using a planner and do supervised learning aka behavioral cloning just to see how well it works this is a nice combination of mostly existing methods with some good insights for how to put them together its not a major contribution but it is well described and the experiments seem good and most importantly i think it is heading in a very good direction lifelong learning and compositional
generalization are critical aspects of moving forward to bigger and more interesting problems and so i think it is important to encourage work in this area docsepthis paper introduces and explores the use of functional compositionality in lifelong reinforcement learning rl in contrast to hierarchical reinforcement which explores temporal compositionality the chaining of options or subpolicies across time functional compositionality involves assembling a novel overall function in the case of rl the policy function by composing together neural modules that perform multistage processing of the policys inputs each module can be regarded as a small function that takes some abstract input x and maps it to an abstract output y the initial set of modules will take the state space as the input x while the final set will take the actions as the output y modules can perform their computations in parallel with their outputs concatenated together afterwards or they can be chained together with the output of one set of modules serving as the input of the next set in this setup the continual learning problem consists of two phases first the correct set of modules for the task must be selected and composed into the policy then the parameters of those neural network modules must be updated from the data generated by the agent selecting the incorrect set of modules will lead to poor transfer and catastrophic forgetting as the parameters of the incorrect neural network modules get overwritten this modularity can help improve robustness by reducing dependencies between modules this is achieved both through parallelism and through the need of downstream modules to generalise to novel combinations of upstream modules the authors validate their approaches on a set of tasks with compositional structure which a functionally compositional policy would be able to exploit as it could substitute out modules according to the components of the current task they compare their framework against alternative continual learning baselines and show that their framework is able to i avoid catastrophic forgetting ii demonstrate better zeroshot transfer to new tasks and iii even exhibit better backward transfer on training tasks overall i found this to be an interesting paper the idea of functional compositionality is rather intriguing and its application to rl and continual learning is original the authors have motivated why a modular policy should improve continual learning on tasks with compositional structure and they have validated their approach with a series of compositional tasks showing that their approach outperforms several alternative approaches to continual learning on a variety of metrics in particular they show empirically that their approach demonstrates better forward transfer with both better zeroshot performance and faster learning on new tasks and that their approach not only avoids catastrophic forgetting but that it can actually demonstrate a degree of backward transfer with training on new tasks improving performance on old tasks methodologically the primary limitation of their approach is that the computational graph linking modules together needs to be specified in advance for each graph node a set of possible modules is defined in advance although the module parameters still need to be learned and the agent must determine which of the possibilities to insert at each node moreover as the authors have pointed out the complexity of finding the right combination of modules can grow combinatorially with the 
number of possible modules making it difficult to scale this to a larger number of modules another limitation is that the overall framework must be divided into distinct training phases with an initialisation phase that must give the modules some initial training before they can be deployed in the computational graph a final limitation is that while the framework shows good performance on compositional gridworld tasks the empirical performance advantage seem weaker on higher dimensional robotic manipulation tasks however in spite of these limitations the approach is novel and these are issues that further development of this approach would need to address for me the primary weakness with this paper is that the exposition of functional compositionality in section 3 could be clearer and improving this would help the paper reach a wider audience the framework and how it was applied to rl only became clearer after reading the subsequent sections particularly section 51 and the algorithmic details in the supplement providing concrete examples may help ground what was otherwise a rather abstract discussion for instance it was initially unclear what the module were doing as i now understand it they are just stages in a multistage hierarchical processing and defining them as solutions to abstract subproblems and as functions that map undefined inputs x to undefined outputs y did not help part of the issue was that i initially understood the modules in set m as being insertable at any node in the computational graph which they are not moreover it was not clear in this section that the computational graph is predefined as i initially thought the agent could flexibly compose the modules together to form any graph and part of the problem it needed to solve was to find the correct graph relatedly algorithm 1 includes many variables and functions whose meaning did not become clear until a second reading of the paper for instance how the discretesearch worked how the modules are composed together and how the algorithm relates to the graph in fig 1 again i had initially understood the discretesearch to be discovering the graph in addition to the modules that should go in the graph node there is one part of the algorithm that remains unclear to me towards the end of paragraph 1 in section 62 the authors mention batch rl as using data from the current task and all tasks that reuse those modules to avoid forgetting by this statement i read two possibilities a the algorithm considers all tasks that share some but not all modules with the current task or b the algorithm considers all tasks that use the exact same set of modules as the current task itd be helpful if the authors could clarify what is meant here additionally if a is meant then how are the gradients propagated so that catastrophic interference does not occur in the modules that are not shared with the current task and if b is meant then how are these tasks considered different from the current task given that they are using the exact same policy network the authors also claim the use of batch rl to overcome catastrophic forgetting as one of their contributions yet it is unclear to me how this is novel rehearsal methods have previously been used to overcome catastrophic forgetting in lifelong learning and in the rl context specifically rolnick et al 2019 specifically used replay techniques to overcome forgetting finally it may be worth noting that besides functional compositionality and temporal hrl compositionality there has also been work at 
least in the cognitive sciences in getting agents to learn compositional internal models of tasks for the purposes of modelbased planning see eg franklin frank 2018 httpsjournalsplosorgploscompbiolarticleid101371journalpcbi1006116 indeed the authors compositional gridworld task is reminiscent of the compositional task in this paper which is in turn based off of the compositional gridsailing task in fermin et al 2010 httpswwwtandfonlinecomdoifull101080002228952010526467 overall i would recommend this paper for acceptance the application of functional compositionality to rl policies is quite original and the approach has been validated empirically but the paper would benefit significantly from greater clarity in its exposition of functional compositionality and in better defining certain variables and functions in algorithm 1 docsepthe paper introduces an algorithm for lifelong reinforcement learning using functional neural composition the algorithm first maps a new problem onto a composition of previously acquired modules then the agent trainsfinetunes the selected module combination on the new task and finally the agent incorporates this newly acquired information into the existing modules this algorithm is then evaluated on two domains one multitask lifelong gridworld domain and a multitask lifelong robotics domain the algorithm produces superior results to several other lifelong learning approaches i very much agree with the authors goals lifelong reinforcement learning is an extremely interesting and underexplored area i also agree that compositional modularity is an extremely promising direction that merits further study by itself unfortunately the paper suffers from several problems in its current version however don not think that these problems are insurmountable the authors should be able to submit an updated version addressing my main issues in time strengths the topic and chosen solution are each highly important research directions with enormous room for future research the paper is extremely well written the language is fluent and the overall structure of the paper is solid the algorithm introduced is a wellengineered solution to the outlined problem weaknesses this is my main point of concern if it is addressed to my satisfaction i will increase my score to weak accept while all explanations and descriptions throughout the paper are easily understood they severely lack in detail that is required to put many statements into context and while most of this detail can be found in the appendix i cannot recommend a paper where the appendix is essential to understanding core parts of the main paper for acceptance it is always a delicate balance to jointly introduce a new domain and a new approach as this can quickly create the impression that a paper codeveloped an approach and a domain the approach is bound to excel at diminishing the appeal of both unfortunately this is the case for the current paper the particular input permutations changing the color of the goal and actionspace permutations are an extremely good fit to the algorithm and its targeted modular architecture one example is that the number of permutations is aligned with the number of modules consequently i find the domain a little too simple each tasks will decompose perfectly in representation and action space even for the robotics domain this is an unrealistic assumption an additional experiment on something with a less welldefined structure would have been very interesting this would have allowed to also 
investigate other interesting properties like learning more complex rules that could eg involve questions of recursivity or similar throughout the paper it is implied that composition is a sequential process this is a useful simplification as it allows routinglike stacking of neural layers however i would strongly argue that it only a simplification true modular compositionality should allow for arbitrary connections imagine eg two perception skills that are required for logical inference or recursive rules detailed remarks p3 l8 i dont understand what a twolayer abstraction should mean p4 top mapping the policy onto a graph of subskills seems off as a graph does not allow for parallelisms that may be required for more complex tasks section 5 here is where the bulk of details are missing section 51 each neural module mi in our architecture is in charge of solving one specific subproblem f what does that mean that there are as many modules as subproblems section 51 a better solution is for each module to receive as input only the information relevant to it such that its output need not characterize any additional information i have no idea what this means section 51 our architecture assumes that the state is composed of modulespecific components too vague section 51 at each depth d in our modular net what is dmax in general how is dmax determined section 52 while simultaneously preventing the forgetting of knowledge stored in those modules we achieve this by separating the learning into multiple stages way too little detail i do not understand this either section 52 for this we train each new task on disjoint sets of neural modules until all modules have been initialized does this mean that the subtasks are also disjoint section 52 incorporate new likely incorrect information the information may be superfluous or distracting but its definitely not incorrect section 52 since a diverse set of modules has already been initialized we do this via discrete optimization of the reward with respect to the possible combinations of modules what does this mean what is discrete optimization section 52 however while experience replay has been tremendously successful in the supervised setting its success in rl has been limited to very short sequences of tasks please elaborate section 52 a module is copied before it is updated to prevent catastrophic forgetting i did not find anywhere how the updates are later reintroduced into the architecture section 6 i argue that at least a rough explanation of the architecture is essential when introducing results i want to know what are the modules how many are there how were they chosen how many parameters do the different approaches have the existing pairwise comparisons are insufficient i especially miss an explanation of how the different modules are partitioned and why they were designed that way a well chosen topic with a highly interesting solution strengths good research direction paper is extremely well written the core of the paper the introduced algorithm is a wellengineered solution to the problem weaknesses all details relevant to contextualize both the approach and the results are in the appendix the paper jointly introduces a domain and an approach opening it to the critic that the experiments are handcrafted towards the approach the experiments seem too simple i disagree with some assumptions behind compositionality docsepupdate after authors response after reading the other reviews the updated manuscript and the authors response i think the authors 
have made several small improvements based on the reviewers suggestions typically i would now be leaning towards a score of 7 however this years conference only allows for either a 6 or an 8 to me personally the paper is still slightly closer to 6 than 8 but to make a clear statement in favor of acceptance and facilitate the reviewers discussion i have raised my score to an 8 i will clearly state my opinion and hesitation to strongly endorse acceptance of the paper during the reviewers discussion summary the paper proposes a novel method for lifelong compositional rl the latter is formally introduced in the paper as lifelong rl learning to solve a continuous stream of tasks without revisiting previously seen tasks with families of tasks that have a known compositional structure the paper proposes to use a modular neural architecture consisting of layers of modules to capture the compositional nature of the tasks to learn through an appropriate moduleselection process the topologically disjoint modules become functionally disjoint this is shown to facilitate forwardtransfer faster learning of novel task variations that follow the compositional structure the paper investigates two moduleselection mechanisms a fixed mechanism where the correct sequence of modules for each task is known and a bruteforce combinatorial search over all module combination to find the highest scoring combination once modules are selected the selected moduleparameters are trained to increase current taskperformance this is then consolidated with previous task experience in an offline rl fashion via batchrl to avoid catastrophic forgetting experiments are shown on a set of 2d gridworld tasks and a number of simulated robot arms with different sensory setups and the method performs well against a number of baseline methods main contributions 1 definition of compositional families of rl tasks as the paper correctly points out a number of previous work on compositional architectures has used variations of offtheshelf lifelong learning tasks which are not explicitly compositional and a beneficial composition of modules is often not known intuitively often not even the number of modules is intuitively clear the two taskfamilies introduced overcome these issues and are a nice contribution towards more meaningful experimentation the gridworld task is conceptually easy and does not have very complicated perceptual problem or highdimensional continuous actionspaces the robotictasks on the other hand is closer to realworld applications and the corresponding complexities significance i think the tasks are well chosen and might become standardtasks for lifelong compositional rl 2 proposal of a compositional architecture and training procedure given that the compositional structure of the tasks is precisely known it is relatively straightforward to propose an exactly matching architecture the main innovation is a training procedure to train such an architecture unfortunately the paper proposes two fairly straightforward solutions that need strong simplifying assumptions one the correct sequence of modules is known and applied accordingly or two bruteforce search over all module combinations which becomes exponentially costly with increasing numbers of modules and requires trainingdata and relies on disjoint sets of tasks initially until all modules have been trained at least on one task significance the two moduleselection mechanisms serve as important controls and lead to promising results but they are also fairly 
straightforward conceptually and rely on often unrealistically strong assumptions what i would have liked to see for more impactful results is an attempt to train the architecture from scratch without bruteforce search and disjoint tasks initially one possible attempt would be to use rl to train a module selection policy similarly to chang 2019 which is used in the paper as a main inspiration for defining compositional rl tasks 3 definition of lifelong compositional rl and using techniques from offline rl to avoid catastrophic forgetting significance these are sensible choices and thus valid contributions but both have been conceptually proposed before in slightly different settings chang 2019 defines compositional supervised problems and using offline rl to avoid catastrophic forgetting and even have backwardstransfer is also not very farfetched quality clarity soundness correctness the paper is well written the problem is well formulated and introduced the experiments are clear and the proposed approach is simple but sound the benefits of the modular architecture over the nonmodular architecture are clear though perhaps less pronounced than expected experiments are repeated multiple times to produce results of statistical significance and the appendix gives all the hyperparameters and architecture details to reproduce the experiments i appreciate that the paper clearly points out the extra amounts of data that go into the modulesearch procedure which helps with comparison against the other methods to the best of my knowledge the claims made in the paper are supported by the empirical results shown my main criticism is that the paper stops short of tackling the most important problems in lifelong compositional rl which are i designing the architecture when the correct number of modules is unknown and the depth ie the number of layers of modules ii training when the correct sequence of modules is unknown and bruteforce search is too costly iii training without disjoint initial training tasks i think each of these problems is hard and fully solving only one of them would merit at least another publication but i am afraid that the current paper implements the baselines that one would like to see in any attempt to solve said problems but makes no attempt to do the latter at least some solution attempts could be taken eg from the supervised literature eg chang 2019 who train a moduleselection policy via rl another strong simplifying assumption is that the state can be separated a priori into relevant subparts for each module such that only relevant information is fed to each module this is perhaps the fourth iv important and difficult problem in training compositional architectures improvements 1 the main improvement that i would like to see is an attempt to train the architecture without knowledge of the correct modulesequence and without disjoint initial training tasks the latter might turn out to be not possible which would also be an interesting finding this attempt would probably consist of several subexperiments including ablations and controls eg training with too many modules etc i understand that this is mostly beyond of whats possible during the rebuttal phase yet i think this would be by far the biggest improvement of the current manuscript 2 while the plots in fig 2 left fig 3 left and fig 4 left nicely visually summarize the performance they do not provide information about how long it takes for the modular mtl to overtake stl does mtl need to see many tasks where it gets 
gradually faster and faster to reach high performance or is there an initial phase of little improvement followed by a takeoff where all subsequent tasks are learned rapidly does that occur after each module has been trained once showing the mean only hides all of these dynamics and it would be nice to see this in a series of appropriate plots it would also help answer questions like is modular mtl worth the overhead if its only 32 or 16 tasks instead of 64 etc 3 figure 2 right the performance gains of modular over monolithic mtl are smaller than intuitively expected one reason could be that the monolithic mtl successfully makes use of the correct modulesequence which is fed to the monolithic architecture as well to test for this it would be nice to see performance of both modular and monolithic mtl when feeding in random information for the modulesequence to use prediction modular mtl without search should catastrophically deteriorate monolithic mtl should deteriorate if it uses this information but should remain the same if the information is not used comments 4 another control experiment knocking out individual modules eg reinitializing one module randomly for the modular mtl with search should lead to deterioration of the corresponding tasks and only these tasks i have no strong reasons to believe otherwise yet it would be an interesting control experiment to see whether the modules have really fully specialized or whether they become responsible for multiple task variations when training with search the paper aims at addressing a very timely and important and notoriously difficult problem lifelong rl with taskfamilies that have compositional structure in principle learning about this compositional structure should greatly facilitate learningspeed for new taskvariations forwardtransfer there are at least four main problems to solve i how many modules are needed and what is the required computational depth ie maximum number of sequential module calls ii how are modules selected for a particular training instance when this is not clear apriori and bruteforce search across modulecombinations is too costly iii how are modules trained initially after random initialization how do they acquire specialization iv how can it be ensured that modules only process relevant information and thus become invariant to irrelevant variation the fifth problem avoiding catastrophic forgetting is actually solved in the paper by borrowing a technique from batch rl all of these problems are hard and active areas of research admittedly the answer to iii might be a curriculum similar to whats proposed in the paper the current paper makes strong simplifying assumptions to reduce the severity of each problem while this is a sensible starting point to establish baseline results what im missing from the current manuscript is an attempt to tackle some of these problems i am not saying this is trivial and i would consider it a significant addition to the paper without such an addition the paper shows promising and interesting results for situations where these problems do not appear which is often unrealistic or for situations where these problems could be solved through some means in the future the paper is well written and clear results are good and i have no complaints regarding the correctness of results and claims however given the strong simplifying assumptions the results are not very surprising which ultimately limits the impact and significance of the paper i currently see no reason to reject the paper but 
also have a hard time assigning a high score i think with a bit of extra work admittedly beyond whats possible in the rebuttal the paper could become a landmark piece of work in lifelong compositional rl ### Summary:
the paper presents a method for compositional task learning in the continual rl setting by composing and reconfiguring neural modules the method is evaluated on minigrids and simulated robot manipulation tasks the reviewers agree and i concur that the paper proposes an interesting solution to a difficult and important problem the paper is well presented and would make a good addition to the multitask continual learning literature the reviewers appreciate the authors responses and the improvements to the manuscript in particular the extra experiments with the wrong number of modules the final version of the paper should clarify the explanation of functional modularity move the relevant pieces to the main text and see gur et al neurips 2021 httpsopenreviewnetforumidcebydmy0ytl for a definition of learnable compositional tasks via petri net formalism
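The brute-force module selection debated in the review and meta-review above can be made concrete with a small, self-contained sketch. Everything below is an illustrative assumption rather than the paper's code: the toy numeric modules, the reward definition, and every function name are invented for the example, and only the combinatorial search stage is shown (the online adaptation and batch-RL consolidation stages are omitted).

```python
import itertools

# candidate modules per depth; simple numeric transforms stand in for the
# neural-network modules of the reviewed approach
MODULES = {
    0: {"scale2": lambda x: 2 * x, "scale3": lambda x: 3 * x},
    1: {"add1": lambda x: x + 1, "sub1": lambda x: x - 1},
}

def compose(names):
    # chain one module per depth into a single policy-like function
    funcs = [MODULES[depth][name] for depth, name in enumerate(names)]
    def policy(x):
        for f in funcs:
            x = f(x)
        return x
    return policy

def episode_return(policy, task_target, probes=range(5)):
    # reward is the negative error against the task's target behaviour
    return -sum(abs(policy(x) - task_target(x)) for x in probes)

def select_modules(task_target):
    # brute-force discrete search over every module combination
    best_combo, best_return = None, float("-inf")
    for combo in itertools.product(MODULES[0], MODULES[1]):
        ret = episode_return(compose(combo), task_target)
        if ret > best_return:
            best_combo, best_return = combo, ret
    return best_combo, best_return

# a new task whose hidden compositional structure is (scale3, sub1)
new_task = lambda x: 3 * x - 1
print(select_modules(new_task))   # -> (('scale3', 'sub1'), 0)
```

Even this toy version enumerates every combination, which is exactly the combinatorial cost the reviewers flag when they ask how the search could scale to a larger number of modules.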
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: to my knowledge this paper is probably the first one to apply fewshot learning concept into highlevel computer vision tasks in this papers sense segmentation it proposes a general framework to few from the very few sample extract a latent representation z and apply it to do segmentation on a query cases of semantic interactive and video segmentation are applied experiments are very thorough we see too many variants of fewshot learning papers on miniimagenet or omniglot for the reason of applying to highlevel segmentation the paper already deserves an acceptance for the first work i believe this work would inspire many followups in related domain especially for highlevel vision tasks comments what is interactive segmentation i looked through the related work it just mentioned some previous work without defining or describing it z is the network output of g is there any constraint on z like gaussian distributions like what z is like in vae models docsepsummary this paper proposed a fewshot learning approach for interactive segmentation given a set of userannotated points the proposed model learns to generate dense segmentation masks of objects to incorporate the pointwise annotation the guidance network is introduced the proposed idea is applied to guided image segmentation semantic segmentation and video segmentation clarity overall the presentation of the paper can be significantly improved first of all it is not clear what the problem setting of this paper is as it seems to have two sets of training data of fullyannotated images for training and the combined set of pointwise annotated images and unannotated images guidance images t in the first equation it is not clear whether authors generate the second dataset out of the first one or they have separate datasets for these two also it is not clear how the authors incorporate the unannotated images for training the descriptions on model architecture are also not quite clear as it involves two components g and f but start discussing with g without providing a clear overview of the combined model i would suggest changing the order of section 41 and section 42 to make it clearer the loss functions are introduced in the last part of the method which makes it also very difficult to understand originality and significance the technical contribution of the paper is very limited i do not see many novel contributions in terms of both network architecture and learning perspective experiment overall i am not quite convinced with the experiment results the method is compared against only a few not popular interactive segmentation methods although there exist many recent works addressing the same task eg xu et al 2016 the experiment settings are also not clearly presented for instance what is the dataset used for the evaluation of the first paragraph in section 51 how do you split the pascal voc data to exclusive sets how do you sample pointwise annotation from dense mask labels how does the sampling procedure affect the performance the performance of the guided semantic segmentation is also quite low limiting the practical usefulness of the method finally the paper does not present qualitative results which are essential to understanding the performance of the segmentation system minor comments 1 there are a lot of grammar issues please revise your draft 2 please revise the notations in equations for instance t x1 l1 cup barx1 ls pjljjin1p 
docsepsummary this paper proposes to formulate diverse segmentation problems as guided segmentation whose task is defined by the guiding annotations the main idea of this paper is using metalearning to train a single neural network performing guided segmentation specifically they encode the s annotated support images into a task representation and use it to perform binary segmentation by performing episodic optimisation the models mapping from guidance to segmentation output is defined by the task distribution strength learning a single segmentation algorithm to solve various segmentation problems is an interesting problem that is worth exploring this paper tackles this problem and showed results on various segmentation problems weakness the proposed method including the architecture and training strategy is relatively simple and very closely related to an existing approach in particular the only differences from the referenced paper shaban et al 2017 are how the support is fused and how multiple guidance signals could be handled which can be done by averaging these differences are relatively minor so i question the novelty of this paper this paper performs experiments on diverse tasks but the method is compared with relatively weak baselines absolute performance looks bad compared to existing algorithms exploiting prior knowledge for each of the tasks for example the oracle performance in semantic segmentation a fully supervised method is 0.45 iou on the pascal voc dataset while many existing algorithms could achieve more than 0.8 mean iou on this dataset in addition i question whether the foregroundbackground baseline is a reasonable baseline for all these tasks because a little domain knowledge might already give very strong results on various segmentation tasks for example in terms of video segmentation one trivial baseline might include propagating ground truth labels from the first frame with color and spatial location similarity which might already be stronger than the foregroundbackground baseline there are some strong arguments that require further justification in 43 the authors argue that the model is trained with s = 1 but could operate with different s and p however it is suspicious whether this would really be true because it requires generalisation to outofdistribution examples which is a very difficult machine learning problem the performance in figure 5 right might support the difficulty of this generalisation because increasing s does not necessarily increase the performance in 53 this paper investigated whether the model trained with instances could be used for semantic segmentation i think performing semantic segmentation with a model trained for instance segmentation on the same dataset might show reasonable performance but this might be just because there are many images with a single instance in each image and because instance annotations in this dataset are based on semantic classes so the argument that training with instance segmentation leads to semantic segmentation should be more carefully made overall comment i believe the method proposed in this paper is rather incremental and the analysis is not supporting the main arguments of this paper and the strength of the proposed method especially simple performance comparison with weak baselines gives no clues about the property of the method and the advantage of using this method compared to other existing approaches ### Summary:
the paper proposes a metalearning approach to interactive segmentation after the author response r2 and r3 recommend rejecting this paper citing concerns of limited novelty and insufficient experimental evaluation given the popularity of this topic in computer vision r1 does not seem to be familiar with the extensive literature on interactive segmentation and their positive recommendation has been discounted the ac finds no basis for accepting this paper
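The reviews of this first paper describe its guided few-shot segmentation setup only in prose: a support branch g encodes the sparsely annotated support examples into a task representation z, and a segmentation branch f conditions on z to predict a binary mask for the query. The PyTorch-style sketch below is a minimal, hypothetical rendering of that data flow; it is not the reviewed paper's implementation, and the module choices, feature sizes, and masked-average fusion are assumptions made purely for illustration.

```python
# Hypothetical sketch of a guided few-shot segmentation forward pass.
# All architectural choices here are illustrative assumptions, not the
# reviewed paper's actual design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GuidedSegmenter(nn.Module):
    def __init__(self, feat_dim=64):
        super().__init__()
        # shared backbone applied to both support and query images
        self.backbone = nn.Sequential(
            nn.Conv2d(3, feat_dim, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, padding=1), nn.ReLU(),
        )
        # decoder maps query features fused with the task representation to a mask logit
        self.decoder = nn.Conv2d(feat_dim * 2, 1, 3, padding=1)

    def encode_task(self, support_imgs, support_masks):
        # g: average support features over the annotated (foreground) pixels
        feats = self.backbone(support_imgs)                        # (S, C, H, W)
        masks = F.interpolate(support_masks, size=feats.shape[-2:])
        pooled = (feats * masks).sum(dim=(2, 3)) / masks.sum(dim=(2, 3)).clamp(min=1e-6)
        return pooled.mean(dim=0)                                  # task representation z, shape (C,)

    def forward(self, support_imgs, support_masks, query_img):
        # f: segment the query conditioned on z
        z = self.encode_task(support_imgs, support_masks)
        q = self.backbone(query_img)                               # (1, C, H, W)
        z_map = z.view(1, -1, 1, 1).expand_as(q)                   # broadcast z over query pixels
        logits = self.decoder(torch.cat([q, z_map], dim=1))
        return torch.sigmoid(logits)                               # predicted query mask

# toy usage with random tensors
model = GuidedSegmenter()
support = torch.randn(2, 3, 64, 64)        # S = 2 support images
support_masks = torch.rand(2, 1, 64, 64)   # point annotations rasterized as soft masks
query = torch.randn(1, 3, 64, 64)
pred = model(support, support_masks, query)   # (1, 1, 64, 64)
```

A Gaussian constraint on z, which one reviewer asks about, would correspond to replacing encode_task with an encoder that outputs a mean and a variance and adding a KL term to the loss, as in a VAE; nothing in the reviews indicates whether the paper actually does this.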
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the authors propose algorithms for the closest fair ranking and fair rank aggregation problems specifically they consider two notions of fairness weak a b-fair up to a prefix k of the ranking and strong fair for all prefixes they provide an exact algorithm for the cfr problem for two metrics kendall tau and ulam their results hold when the number of groups is a constant in the case of the ulam metric strengths 1 fair ranking is an important problem 2 all the details are presented the paper is however a bit hard to read in some places 3 the algorithms are fairly intuitive weaknesses 1 the algorithms are fairly simple to analyze 2 while fair ranking is an important problem the same might not be the case for rank aggregation one of the primary goals for aggregation is to overcome biases from any individual ranking 3 comparison to related work especially in fair group formation and fair clusteringpartitioning is missing the latter problems are especially related under the weak notion of fairness when k is set to the size of each partitionclustergroup while the problem of fair ranking is an important problem to consider fair rank aggregation is not well motivated docsepfrom a fairness or diversity perspective this paper examines ranking problems in which candidates belonging to different groups have a fair representation in the final rankings using a linear time exact algorithm the algorithm will find the closest fair ranking for the kendall tau metric under strong fairness where the final ranking is fair for all values of k given the parameters for defining fair representation number of candidates from a particular group in the topk positions of the ranking the authors also provide an exact algorithm for finding the closest fair ranking for the ulam metric under strong fairness when there is a small number of groups additionally the authors propose a novel metaalgorithm for the general rank aggregation problem under the fairness framework in a nutshell the main contribution is to develop a novel algorithmic toolbox for fair rank aggregation that solves a variety of rank aggregation objectives satisfying such generic fairness constraints an essential takeaway of this work is that a set of potentially biased rankings can be aggregated into a fair ranking with only a small loss in the quality of the ranking the paper thoroughly studies the fair rank aggregation problem under different fairness constraints it provides two metaalgorithms to approximate the fair aggregated ranking there are a lot of theorems claims lemmas and definitions in the paper which is good in a sense because it adds details but overall makes the paper not easy to follow i suggest moving some of them to the appendix one of the main sections of the paper fair rank aggregation starts on page 7 while the paper looks interesting and novel it is a purely theoretical paper which im not sure is a good choice for this venue the paper introduces a generic approach and argues it is a novel algorithm but it does not test this method on simulated or realworld data it would be interesting to see some experiments and comparisons with the methods discussed in 1 or 2 as these papers are mentioned in the paper as the closest approaches and they provide experiments to show the effectiveness of their method missing related work in line 43 besides the mentioned papers 1 is also a new related work that outputs a fair ranking robust to label noise that
maximizes the ranking utility subject to group fairness constraints based on exposure such as demographic parity it is an inprocessing method which presents a preferable tradeoff between fairness and utility 1 caitlin kuhlman and elke rundensteiner rank aggregation algorithms for fair consensus proceedings of the vldb endowment 13(12) 2020 2 david wei md moinul islam baruch schieber and senjuti basu roy rank aggregation with proportional fairness in sigmod page to appear 2022 3 omid memarrast ashkan rezaei rizal fathony and brian ziebart fairness for robust learning to rank arxiv preprint arxiv:2112.06288 2021 i dont know if they have discussed the limitations of their approach docsepthe authors study fair rank aggregation where candidates may belong to different groups and each group must be represented fairly in the final ranking in a way that the designer can specify they provide a linear time algorithm for finding the closest fair ranking under proportional fairness for the kendall tau distance metric and a polytime dynamic programming algorithm for the ulam metric for a constant number of groups which is the case in practice they then study the fair rank aggregation problem and show that many biased rankings can be aggregated into a fair ranking with only a small loss in quality for a variety of distance metrics strengths the definition of proportional fairness is general and flexible and nicely generalizes previous results the fair rank aggregation problem is also an interesting line of inquiry where the goal is to find a fair ranking that minimizes the generalized mean distance to the input profile strengthweakness the metaalgorithms for fair rank aggregation are also simple which could be a weakness but as a first foray into the field seems like a plus where the ideas are either to 1 find fair approximations of each input ranking and then choose the most central one or 2 find a fair approximation of the unfair ranking that minimizes the generalized mean objective however there is perhaps more room for more sophisticated algorithms that dont involve the cfr subproblem weaknesses the algorithms described in the paper are all relatively simple even if the analysis is nontrivial this is only a minor critique however as an early paper in this area that is to be expected yes docsepthis paper focused on the rank aggregation problem under a very general notion of proportional fairness the authors propose a linear time exact algorithm to find the closest fair ranking cfr by a greedy strategy for the kendall tau and ulam metrics and propose a novel algorithmic toolbox to solve a wide variety of rank aggregation objectives satisfying such generic fairness constraints they provide a rigorous mathematical derivation to prove that there exists a linear time ranking algorithm strengths this paper gives a novel metaalgorithm for the general rank aggregation problem under the fairness framework which works for any generalized mean objective and any fairness criteria this paper proves that there exists a linear time ranking algorithm through rigorous mathematical derivation
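The reviews in this example repeatedly contrast rankings that are fair at every prefix (strong fairness) with rankings that are only fair up to a prefix k (weak fairness). The short check below makes the strong notion concrete for a proportional-representation constraint; the specific floor/ceil bounds and the dictionary-based encoding are illustrative assumptions, not necessarily the exact definition used in the reviewed paper.

```python
import math

def is_strongly_fair(ranking, group_of, alpha, beta):
    """Check proportional representation of every group at every prefix length k.

    ranking  : list of candidate ids, best first
    group_of : dict mapping candidate -> group id
    alpha    : dict mapping group -> lower proportion bound in [0, 1]
    beta     : dict mapping group -> upper proportion bound in [0, 1]
    """
    counts = {g: 0 for g in alpha}
    for k, candidate in enumerate(ranking, start=1):
        counts[group_of[candidate]] += 1
        for g in alpha:
            low = math.floor(alpha[g] * k)
            high = math.ceil(beta[g] * k)
            if not (low <= counts[g] <= high):
                return False
    return True

# toy usage: two groups with equal target proportions
ranking = ["a1", "b1", "a2", "b2"]
group_of = {"a1": "A", "a2": "A", "b1": "B", "b2": "B"}
print(is_strongly_fair(ranking, group_of,
                       alpha={"A": 0.3, "B": 0.3},
                       beta={"A": 0.7, "B": 0.7}))   # True for this alternating ranking
```

Weak fairness, as described in the reviews, would run the same per-prefix check only for prefix lengths up to a fixed k.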
algorithms some formulations are not correct for example the generalized mean objective $\big(\sum_{i=1}^{n} \rho(\pi_i, \sigma)^{q}\big)^{1/q}$ is written inconsistently between section 11 and line 105 in section 1 i think the contribution of the proposed algorithm is that the authors design a novel objective function for the greedy strategy for fair ranking and prove the algorithm is a linear time ranking algorithm however the greedy strategy for ranking lacks innovation this paper focuses on the mathematical derivation for the fair rank aggregation and illustrates that the proposed algorithms are better than the concurrent work wisr22 in line 132 maybe the authors can add more comparisons and give a more detailed explanation to prove the superiority of the proposed algorithms this paper studies group fairness as well as proportional fairness the assumption of this problem seems to be that each candidate only belongs to one group i wonder if the algorithms in this paper still work if each candidate can belong to multiple groups at the same time ### Summary:
reviewers liked the novelty in the fair rank aggregation problem and enjoyed the simplicity of the proposed algorithms the theoretical results are solid though the proof techniques are believed to be standard there is large room to improve the quality of presentation reviewers opinions stayed the same after the rebuttal and no additional points were raised during the discussion and no reviewer had a strong opinion on acceptance or rejection so the paper is on the slightly positive side of the fence
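To make the objects discussed in this example concrete, the sketch below implements the Kendall tau distance and the generalized mean objective mentioned by the reviewers, together with the first meta-algorithm idea: given a profile of input rankings and a set of already-fair candidate rankings (for example, fair repairs of each input ranking produced by some closest-fair-ranking routine, which is assumed rather than implemented here), return the candidate minimizing the objective. Function names and the handling of the exponent q are illustrative assumptions, not the reviewed paper's code.

```python
from itertools import combinations

def kendall_tau(pi, sigma):
    """Number of candidate pairs ordered differently by the two rankings."""
    pos_pi = {c: i for i, c in enumerate(pi)}
    pos_sigma = {c: i for i, c in enumerate(sigma)}
    return sum(
        1
        for a, b in combinations(pi, 2)
        if (pos_pi[a] - pos_pi[b]) * (pos_sigma[a] - pos_sigma[b]) < 0
    )

def generalized_mean_objective(profile, sigma, q=1.0):
    """(sum_i d(pi_i, sigma)^q)^(1/q) with d the Kendall tau distance."""
    return sum(kendall_tau(pi, sigma) ** q for pi in profile) ** (1.0 / q)

def aggregate_from_fair_candidates(profile, fair_candidates, q=1.0):
    """Meta-algorithm sketch: pick the fair candidate closest to the profile."""
    return min(fair_candidates, key=lambda s: generalized_mean_objective(profile, s, q))

# toy usage: three input rankings, two hypothetical already-fair candidates
profile = [["a", "b", "c", "d"], ["a", "c", "b", "d"], ["b", "a", "d", "c"]]
fair_candidates = [["a", "b", "c", "d"], ["b", "a", "c", "d"]]
print(aggregate_from_fair_candidates(profile, fair_candidates, q=2.0))
```

With q = 1 this reduces to the usual sum-of-distances (Kemeny-style) objective; larger q puts more weight on the worst-served input ranking, which is the generalized mean family the reviews refer to.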
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper argues that in the wisdomofthecrowd paradigm plurality voting may not necessarily yield the correct answer when the majority makes systematic errors the paper presents a theoretical framework to elicit the thinking hierarchy and demonstrates that their method outperforms plurality voting and also demonstrates certain desirable properties apart from presenting the theoretical framework the paper also conducts crowdsourced user studies to demonstrate the practical effectiveness of their framework strengths 1 the papers motivation is clear and important if a vast majority displays certain biases plurality voting would not be very useful 2 the papers writing and organization are good the examples provided help understanding weaknesses 1 the paper can be strengthened by making the user study more elaborate with a clear description of how the questions are selected no negative societal impact is discussed i do not see any obvious red flags in terms of negative societal impact docsepthe crux of this paper is that it provides an empirically validated way to use the wisdom of the crowd rather than the default popular answer plurality voting paradigm in the process the authors show how to obtain the thinking hierarchy of the crowd for the given set of questions they claim that knowing this rich hierarchy helps in areas like policy making using mathematical tools and certain assumptions the paper demonstrates the superiority of their method especially when obtaining a prior distribution from the crowd is prohibitively expensive originality the authors distinguish themselves from previous work by discarding the use of a prior distribution further they claim that their thinking hierarchy learning model captures a richer spectrum of answers their method reduces bias like previous work does with the difference that they do not collect a prediction distribution but rather a single prediction answer they also differentiate themselves from the game theoretic setting the concept of the thinking hierarchy is sufficiently novel wrt past work quality the paper provides mathematical justification although under certain idealistic assumptions for every statement it makes the experiment results have been provided at a url and are easy to grasp the authors are upfront about the assumptions that are required and also provide alternatives for eg the iid assumption under which respondents make their predictions is contrasted with picking the first prediction that a respondent makes due to the fact that iid predictions dont match with reality clarity the paper is clear about what it aims to achieve and under what conditions it can achieve it the goal is to learn the thinking hierarchy among respondents and to do so without collecting prior information the authors also provide future uses of their work from the ml and scale points of view the experiments are easy enough for a novice reader to understand and collect sufficient information significance the method described in the paper has significance in that it is applicable to cases where collecting prior information from a crowd is difficult and plurality voting is not sufficient one can foresee future applications of systematically obtaining a thought hierarchy as more information on a certain topic or policy can only help with decision making the authors discuss technical limitations such as the iid assumption on user predictions negative impacts on
society are not particularly discussed but one could think of situations such as political mercenaries or governments collecting information on thinking patterns of a voting population in pursuit of an agenda docsepthe paper proposes a mathematical model for thinking hierarchy of users predictions for their answer and other users answers given the joint distribution of the answers and predictions of other answers the authors show that the parameters of the model can be derived using a novel matrix factorization the authors solve an equivalent frobenius norm minimization problem for the special case when the w matrix one of the parameters of the model is semiorthogonal and then the authors propose a bruteforce search based algorithm to find the ranking of the answers from the thinking hierarchy since the joint distribution of answers and predictions may not be available they use an empirical estimate naturally this works if the samples are iid strengths 1 the problem of eliciting thinking hierarchy to come up with the correct ranking over various answers is wellmotivated and is an important problem to study 2 the paper seems to be placed well in the recent literature about using additional information about peoples prediction about other answers to get better accuracy for ranking answers 3 the theoretical results about the problem are interesting however somewhat limited weaknesses 1 the main drawback is that the algorithms proposed in the paper are just bruteforce search based algorithms and hence are not efficient for complicated questions 2 the theoretical guarantees about eliciting thinking hierarchies hold in fairly restricted settings which may not hold majority of the times in realworld especially the questions and answers very quickly become complicated in which case pluralityvoting could be a better method 3 basically for a complete picture the complexity of the questions and answers needs to be captured in this framework in what cases does pluralityvoting give better ranking than the algorithms in the proposed framework i dont understand when is the side information better 4 the writing can be improved see questions i dont see any mention of the negative impacts of when the proposed algorithms are bad compared to standard algorithms this discussion is needed docsepthe paper proposes a method to aggregate crowdsourcing answers based on a key observation experts have different expertise levels and experts at a higher level are able to simulate the experts at lower levels the paper aims to find this underlying thinking hierarchy which can be used to find the best answerby asking people to predict other peoples answers the paper first proposes a model for this thinking hierarchy the model assumes that experts have their types and for each type there is a thinking oracle that specifies how this type of experts generate their answers an expert can run the oracles with lower types but never higher types the paper then develops two algorithms to learn the underlying thinking hierarchy the algorithms basically generate an answerprediction matrix based on experts answers and predictions and then find a ranking of answers that will reorder the matrix to match the hierarchical structure in the best way the paper provides theoretical justification for their algorithms finally the paper shows by realworld experiments that their methods outperform plurality voting the strengths of the paper are the algorithms and the experiments the paper proposes novel methods to utilize the underlying thinking 
hierarchy when aggregating experts answers and tests the method through realworld experiments the algorithms are intuitively reasonable and the experiment results are good the weakness of the paper is the modeling of the thinking hierarchy which also makes the theoretical justification of the algorithms implausible the use of thinking oracle seems unreasonable to me are these thinking oracles private information or public information if they are private information how can the experts of higher types use the lowertype oracles if they are public information why cant the lowertype experts use the highertype oracles it is also ungrounded why an expert generates predictions by running thinking oracles of others the assumption seems crucial for the theoretical analysis which in my point of view should be more carefully justified yes ### Summary:
this work proposes a framework to elicit peoples thinking hierarchy that helps improve the wisdom of the crowd even if the majority is wrong the reviewers overall appreciate the main idea of the work and believe it makes a nice contribution to the literature there have been some questions and concerns raised about the efficiency of the algorithm and the model limitations to which the authors have provided reasonable responses we encourage the authors to incorporate those responses and other reviewer comments into the final version of the paper
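An illustrative sketch of the aggregation idea these reviews describe, not the paper's actual algorithm (which the reviews say is a matrix factorization / Frobenius-norm minimization): collect each respondent's own answer together with the answer they predict others will give, then search over orderings of the answers for the one with the fewest hierarchy violations. The question, the response counts, and the violation-count score below are all invented assumptions for illustration; the point is only that predictions of other people's answers can overturn a plurality vote.

```python
from itertools import permutations

# Hypothetical responses to a bat-and-ball style question: each tuple is
# (respondent's own answer, the answer they predict most others will give).
# The counts are invented; the plurality answer "10" is the common wrong one.
responses = [("10", "10")] * 6 + [("5", "10")] * 3 + [("5", "5")]

answers = sorted({a for pair in responses for a in pair})

def hierarchy_violations(order):
    # order[0] is the top of the hierarchy; predicting an answer ranked
    # strictly above your own violates the assumption that lower levels
    # cannot simulate higher ones.
    rank = {a: i for i, a in enumerate(order)}
    return sum(rank[pred] < rank[own] for own, pred in responses)

best_order = min(permutations(answers), key=hierarchy_violations)
plurality = max(answers, key=lambda a: sum(own == a for own, _ in responses))

print("plurality vote picks:", plurality)               # "10"
print("hierarchy ranking   :", " > ".join(best_order))  # "5 > 10"
```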
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper studies the federated generalized linear bandits problem where each client faces a generalized linear bandits model the parameters and the link functions across clients are the same the learning objective is to minimize the cumulative regret of those clients the paper proposes an algorithm called fedglbucb with regret bound odsqrtt and communication cost at most odn2sqrtt where d is the dimension of parameter space t is the time horizon and n is the number of clients experiments are performed on both synthetic and real world dataset the glb setting introduces new challenges in the federated setting since the estimation of the parameter does not have closed form and should be iteratively updated this imposes challenges for efficient communication and causes drifting issues at local updates the proposed fedglbucb algorithm addresses such challenges by combining online local updating and offline regression for global update it rigorously proves that fedglbucb achieves the same learning regret order as in the centralized setting with a sublinear communication cost the paper is in general well written and easy to follow although the communication cost is sublinear the order of on2sqrtt still seems a little bit large since federated linear bandits can achieve onlogt the paper discusses limitations of current analysis and potential future directions docsepthis paper studies the generalized linear bandit in the federated setting in contrast to prior work on federateddistributed bandits that consider the linear bandit setting the generalized linear bandit is more challenging since the lack of a closedform solution prevents naively broadcasting sufficient statistics in the generalized linear bandit the authors propose a novel algorithm textttfedglbucb for the generalized linear bandit in the federated setting that combines offline and online regression in order to balance communication and computation the authors demonstrate that textttfedglbucb obtains competitive regret with a moderate communication budget and also demonstrate its efficiency on a synthetic and realworld benchmark strengths federated bandits are an upcoming area of interest within the bandit community and hence the topic of this paper is very relevant to that community the authors are correct in that most prior work focuses on linear bandits in the federated setting and extending those algorithms to the generalized linear bandit is nontrivial and hence this is an important challenge the paper is wellwritten and easy to understand the central algorithm design components are easy to grasp the presented algorithm is noregret with efficient communication and works well on benchmarks outperforming prior work weaknesses the authors do not discuss optimality within the problem setting what is the optimal level of communication for noregret learning in federated generalized linear bandits it appears that the suboptimality eg compared to the linear bandit arises from the global update communication do the authors have any concrete remarks regarding this the update does nullify the value in local updates as pointed out by the authors in the conclusion as well are there any assumptions that can allow reusing local updates in the global aggregation step the authors have discussed their limitations in the conclusion however it would be great if they could provide more substantial remarks in those directions docsepthis 
paper considers federated learning for generalized linear bandits the key difference compared to federated linear bandits is that it requires an iterative process for the global update rather than relying on sufficient statistics the authors propose an efficient algorithm that is able to achieve a tradeoff between communication and regret some variants of the proposed algorithm are also studied experiments are also conducted to corroborate the theoretical results strengths a first study on the new problem both theoretical results and empirical experiments weaknesses technical contributions are limited the presentation may be improved yes ### Summary:
federated bandits are a current area of interest within the community and the paper provides valuable contributions in particular the authors deal with the rather general glm setting provide algorithms and study the regret it would be useful if the authors would use the discussions with the reviewers and the reviewers comments to improve and polish the paper
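A toy sketch of the point these reviews make about generalized linear bandits: unlike the linear case there is no finite sufficient statistic to aggregate, so the global update has to be an iterative (offline) fit, while clients make cheap online updates in between communications. This is not the paper's FedGLB-UCB algorithm: arm selection here is greedy rather than UCB, the server naively pools raw observations (ignoring the communication-cost question the paper actually studies), and all constants are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_clients, rounds, local_steps = 5, 4, 20, 25
theta_star = rng.normal(size=d) / np.sqrt(d)          # unknown shared GLM parameter
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

theta_global = np.zeros(d)
data_X, data_y = [], []                               # everything sent to the server

for _ in range(rounds):
    for _ in range(n_clients):
        theta_local = theta_global.copy()
        for _ in range(local_steps):
            arms = rng.normal(size=(10, d)) / np.sqrt(d)
            x = arms[np.argmax(arms @ theta_local)]   # greedy pick (no UCB bonus here)
            y = float(rng.random() < sigmoid(x @ theta_star))   # Bernoulli reward
            # online local update: one stochastic gradient step on the logistic loss
            theta_local -= 0.5 * (sigmoid(x @ theta_local) - y) * x
            data_X.append(x)
            data_y.append(y)
    # offline regression for the global update: an iterative batch fit is needed
    # because the logistic link has no closed form / finite sufficient statistics
    X, y = np.asarray(data_X), np.asarray(data_y)
    for _ in range(300):
        theta_global -= 0.2 * X.T @ (sigmoid(X @ theta_global) - y) / len(y)

print("parameter estimation error:",
      round(float(np.linalg.norm(theta_global - theta_star)), 3))
```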
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper studies how randomly initialized relu networks split the data into linear regions it is based on hanin and rolnick 2019a but focuses on data with low dimensional manifold structure it provides an upper bound for the density of the linear regions which results in a lower bound for the average distance to the boundary of all linear regions this lower bound is inverse linear in the number of neurons and related to the geometric properties of the manifold the authors then empirically verify their statement over both synthetic datasets of two parameterized curves and one realistic dataset of a curve generated from stygan strength the problem this paper trying to solve is interesting in many applications data exhibit low dimensional or low dimensional manifold structures understanding how the intrinsic dimension and geometries interact with the deep network is an interesting problem the paper is able to relate the average distance from the linear region to parameters depending on the scalar curvature and dimensionality of the manifold while still preserving the same dependence on the number of neurons as in the euclidean case the experiments are insightful and support the intuitions of the theory weakness 1 the paper mostly serves as a theoretical paper but some key definitions are not presented in a clear and rigorous way the paper is partially based on hanin 2019 so the authors seem to adopt some of the notation in that paper without further clarification for example the definition of mathcal bf k defined in line 162 is not clear in the first read i need to go and forth between this and hanins paper to understand a similar definition and the relationship between them i would encourage the author to provide a formal math definition of mathcal bf k in the appendix 2 the role of the geometries of the manifold is not clearly laid out in the theorem the key benefit of considering data with structure is to understand how the structure interacts with the neural networks theorem 34 which is the main theorem of the paper is very similar to the euclidean case in the sense that it has inverse dependence over the number of neurons however the definition for cm and cm kappa is not provided in an intuitive way for the readers to understand the relationship between the geometries of the manifold and average distance for example one would like to know how cm and cm kappa depend on the scalar curvature and dimension of the manifold in the current form it would be hard for the reader to understand the exact benefit of considering data with low dimensional manifold structure 3 the linear region density lefthand side of equation on line 145 or line 202 only serves as a proxy for the number of linear regions in the later work of hanin and rolnick 2019b the number of linear regions would still depend on textneuronsnin so theorem 33 does not provide a bound for the number of linear regions directly minors 1 line 126 it should be zx wl1sigmacdots sigmab1 w1x in line 127 it should be zx w1 zx 2 line 22 in the appendix should be gijx 3 many reference links in the appendix are broken 4 in appendix line 216 the definition of kerdmhkperp should be kerdmhkperp seta vert a cdot z 0 forall z in kerdmhk instead of for some z the authors have adequately addressed the limitations and potential negative social impact of their work docsepwith the assumption that a dataset is sampled from a low dimensional 
manifold this paper incorporated such data geometry into measuring the effective approximation capacity of dnns by deriving average bounds on the density of boundary regions and distance from the linear boundary which is reported in previous work by hannin and rolnick strengths originality this work builds upon previous work by hannin and rolnick which establishes the linear regions in deep networks and provides an original study on how neural networks with relu activation can split data manifolds into regions where the neural network behaves as a linear function quality the study is of relatively good quality bound on the number of linear regions and the distance to boundaries of these linear regions on the data manifold are derived clarity the paper is very clearly presented weaknesses significance while building upon hannin and rolnicks previous work the significance of this extension is not very clear especially how this work leads to understanding the expressivity of deep networks on noneuclidean data sets as mentioned by the authors proving a lower bound on the number of linear regions still remains open docsepthis paper studies the data geometry of randomly initialized deep neural networks with relu activation motivated by the manifold hypothesis the authors present an extension of the previous result in hanin and ronick 1 where the previous assumption of uniform sampled data on the entire input space is relaxed to sampling on submanifolds the authors develop analogous upper bounds on the density of boundary regions and lower bound on expected distance to the boundary the authors then verified their results empirically using both synthetic datasets and the metfaces dataset 1 bhanin and d ronick complexity of linear regions in neural networks icml 2019 quality and clarity the paper is in general wellwritten i enjoyed reading the motivations of the paper as well the interpretation sections for the theorems while i have not checked the proofs carefully the progression of the theorems make sense to me the experiments also empirically support the authors interpretations though i have some questions which i will ask in the next section originality and significance the paper is mainly an extension of the previous hanin and ronick paper onto the submanifold case while the framework here is not new i believe it is still a novel result as the relaxed assumption of the data lying on a submanifold is indeed more realistic that said i am personally not a researcher in this field and i would refer to other reviewers for further evaluation of the significance of the results na docsepthe paper continues the work of hanin and rolnick 12 on the expressivity of piecewise linear pwl deep neural networks dnns the main contribution c1 involves improving the theoretical bounds of both the number of linear regions and the average distance to the linear boundaries defined by a pwl dnn by taking into account the geometry of the input manifold the authors also provided a toy example to empirically validate the results in addition they performed an experiment of natural images c2 to empirically show that both memorization and closeness to the data manifold have an effect on the density of linear regions references 1 boris hanin and david rolnick deep relu networks have surprisingly few activation patterns neurips 2019 2 boris hanin and david rolnick complexity of linear regions in deep networks iclr 2019 strengths strength 1 the paper addresses the significant problem of extending the bounds on expressiveness 
metrics of pwl dnns in 12 by taking into account the dimensionality and geometry of the input manifold s2 i really liked the toy example as a way to validate the analytical results although the clarity could be improved s3 the authors helped the reader in understanding the intuition behind the theoretical results by providing useful figures 1 and 2 in the main 1 and 2 in the appendix weaknesses with regard to the weaknesses some of the experimental results 2 out of 3 are not sufficiently clear or do not provide a significant contribution weakness 1 sect 42 the authors claimed to have obtained results figures 6 and 7 in contrast with 2 however it is not clear to me which result in 2 the authors are referring to see questions w2 sect 43 the metfaces experiment represents an additional separate contribution c2 but due to lack of space the results were not validated by enough empirical tests for example claiming a relationship between the decreasing number of linear regions during training and memorization was not validated against any hyperparameter change as it was done in 1 in fact the behaviour of the number of regions in an overfitting scenario was shown in 1 to be dependent on both the number of points to be memorized against the capacity of the network and the performance of the model not reported w3 sect 43 the second result of the metfaces experiment ie the difference in the number of regions in and out of the data manifold was already shown in 3 figure 2 with a slightly different experimental setting although using a gan to fit the indistribution manifold is a nice idea the obtained result in my opinion doesnt contribute to the overall significance of the paper concluding remarks although the paper addresses a significant research question validated with a toy example i think that more empirical experiments should have been devoted to the main contribution the role of the manifold geometry instead of focusing on the role of memorization and closeness to the data manifold both already analyzed indepth in previous papers 13 moreover i have doubts about some of the claims resulting from the experimental results which either are not clear or do not address the limitation of the experimental evaluation reviewerauthors discussion after the reviewerauthors discussion i increased the rating of both soundness and presentation moreover i increased the overall rating from 4 to 6 please refer to the discussion for the details references 1 boris hanin and david rolnick deep relu networks have surprisingly few activation patterns neurips 2019 2 boris hanin and david rolnick complexity of linear regions in deep networks iclr 2019 3 roman novak et al sensitivity and generalization in neural networks an empirical study iclr 2018 i would have addressed the role of hyperparameter selection in the experimental results since previous work clearly stated that size learning rate and other hyperparameter choices have an effect on the evolution of the number of linear regions during training ### Summary:
the paper studies the number of linear regions cut out by a randomly initialized deep network for data with lowdimensional structure manifold structured data the main results pertain to the density of linear regions and the average distance to the boundary of a linear region these results take the same form as in the euclidean case with distance inversely proportional to the number of neurons but depend on geometric properties of the data manifold in particular its dimension and curvature reviewers generally appreciated the relevance of the papers setting data arising in applications often have lowdimensional structure and understanding how deep networks interact with the structure of data is an important research direction at a technical level the paper builds on techniques of hanin and rolnick 2019 but extends these results to manifold structured data questions raised by the reviewers include the role of curvature and input dimension in the results and the interpretation of real data experiments after interacting with the authors the reviewers considered their main concerns about the paper to be welladdressed the ac concurs and recommends acceptance
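A rough empirical sketch of the quantity discussed in these reviews: how many linear-region boundaries a randomly initialized ReLU network places along a one-dimensional manifold (a circle) embedded in a higher-dimensional input space, counted by tracking the sign pattern of all preactivations along the curve. The architecture, initialization, and sampling resolution are assumptions for illustration, not the papers' exact experimental setup.

```python
import numpy as np

rng = np.random.default_rng(1)
d_in, widths = 20, [64, 64]                       # input dimension, hidden layer widths

# Randomly initialized ReLU network (He-style weights, small random biases).
layers, fan_in = [], d_in
for w in widths:
    layers.append((rng.normal(size=(w, fan_in)) * np.sqrt(2.0 / fan_in),
                   0.1 * rng.normal(size=w)))
    fan_in = w

def activation_pattern(x):
    """Sign pattern of every preactivation; it is constant inside one linear region."""
    signs, h = [], x
    for W, b in layers:
        z = W @ h + b
        signs.append(z > 0)
        h = np.maximum(z, 0.0)
    return np.concatenate(signs)

# A unit circle (1-dimensional manifold) embedded in R^d_in via a random 2-plane.
basis, _ = np.linalg.qr(rng.normal(size=(d_in, 2)))
ts = np.linspace(0.0, 2.0 * np.pi, 20000, endpoint=False)
curve = np.cos(ts)[:, None] * basis[:, 0] + np.sin(ts)[:, None] * basis[:, 1]

patterns = [activation_pattern(p) for p in curve]
crossings = sum(not np.array_equal(patterns[i], patterns[i - 1])
                for i in range(len(patterns)))    # i = 0 wraps around the closed curve

print("total neurons:", sum(widths))
print("region boundaries crossed along the circle:", crossings)
print("average arc length between boundaries:", 2.0 * np.pi / max(crossings, 1))
```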
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: trustworthy ai is of interest to many as systems are quickly being deployed in the real world there is a chance that the models being deployed may contain a backdoor policy which acts maliciously when triggered by an interested attacker who may perturb the agents observations this work attempts to clearly define the objective of such an attacker describes the limits of their behavior and attempts to defend against such an attack it introduces the concept of a safe subspace based on testing the provided policy on a clean environment and then uses this empirically derived subspace to sanitize the policy akin to a denoising step bounds on performance of this sanitized policy wrt the optimal policy are given strengths intuitive examples fig 2 explaining role of backdoor attacker bounds provided for sanitized policy behavior in clean environment tackles question of choice when subspace dimension not known sec 54 weaknesses experiments on discrete action spaces does this generalize to mujoco environments and robotic problems analysis is on zero mean state case and claims to generalize to the nonzero mean case but this is not substantiated in my view assumptions on attacker might not be valid in many cases limiting to eperp the point on the attacker needing to limit in eperp space may not be true in practice and the need for clean environment interactions to get the sanitization policy is mentioned it is not certain that this will generalize to continuous action spaces docsepthis paper proposes a defense algorithm against backdoor attacks for reinforcement learning specifically the defense is based on singular value decomposition of the covariance matrix of state occupancy the algorithm sanitizes the poisoned policy by projecting states into a safe subspace formed by a lowrank decomposition of the covariance matrix based on several assumptions on the state occupancy structure the authors show that the proposed algorithm guarantees a bounded difference between the optimal policys value and the sanitized policys value experiments on a vectorinput game and a pixelinput game verify the effectiveness of the proposed algorithm strengths the tackled problem defending against backdoor policy attack is important the proposed algorithm applies svd to filter out triggers which makes intuitive sense the experiment results are good it is surprising that the proposed svdbased defense also works for pixel inputs the presentation of this paper is good and easy to follow the authors make the attack and defense problem clear with nice visualization figures the mathematical notations are also clean to me weaknesses the algorithm although it makes intuitive sense may have some difficulties scaling up it is nice that the authors provide a hyperparameter check for d in figure 5 and show that d can be selected based on the eigen gap but the experiment is only done in one pixelinput task breakout which is a relatively simple game making the results less convincing more experimental results would be appreciated the explanation of the sample complexity is too simplified how would the dependence on d b le k sigma influence the complexity they seem to reveal the intrinsic hardness of defending in the environment the paper briefly discusses the limitations of the work in section 6 docsepthis paper proposes a defense method to deal with the backdoor attack on reinforcement learning rl instead of retraining the rl policy or directly
modifying the model parameters the proposed method serves as a wrapper that projects the observation onto an estimated safe subspace and thus eliminates the potential triggers this helps the rl agent retain its performance even in triggered environments the authors also provide theoretical analysis for their method strengths 1 originality the idea of this paper is novel and the proposed method is a promising exploration of the defense mechanism against backdoor attacks on rl 2 clarity the paper is organized well and the paper is easy to follow weaknesses 1 quality the results of this paper are sound and well supported by the conducted experiments however the method seems to rely on a strong assumption please see detailed discussion in the question section also the method is only tested in two atari games which is kind of limited 2 significance the experiments prove the method can work in some scenarios however there are no comparisons with any existing works therefore it is unclear how much this method advances the state of the art please check the questions section docsepthis paper formulates the backdoor policy attack it specifies a kind of backdoor policy attack and provides an algorithm to defend against such an attack it provides a proof to bound the suboptimality of the policy sanitized by the defense it also conducts experiments to verify their algorithm strengths backdoor attack is an interesting problem and the formulation in this paper is useful for theoretical study and algorithm design they also provide an algorithm and prove the soundness of the algorithm both theoretically and empirically weaknesses the writing of the proof for the main theorem can be improved moreover their algorithm and analysis can only be used on the simple backdoor attack that is consistent with their assumption their formulation only adapts to the case where the support of the image of the trigger function is a linear subspace of rd which might be restricted for real-life applications ### Summary:
the authors present a novel algorithm for defending against backdoor policies in reinforcement learning rl the main idea is to project observations onto a safe subspace which cleans out the backdoor the authors present both empirical findings and theoretical results for their method there was an active discussion between reviewers and authors in which the main concerns were addressed i recommend that the authors follow the suggestion of reviewer iqtz and emphasise the restriction on the attackers ability more clearly to avoid any overclaims about the capability of the defense method
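to make the mechanism in this summary concrete, here is a minimal numpy sketch of the kind of sanitization wrapper the reviews describe: estimate a safe subspace from states collected in a clean environment via an svd of the (centered) state matrix, then project every observation onto that subspace before it reaches the policy. the subspace dimension d, the data-collection step, and the policy/environment interface are hypothetical placeholders, and the mean-centering is one plausible choice given that the papers analysis is stated for the zero-mean case; per the reviews, d could be chosen from an eigen gap of the state covariance.

```python
# hypothetical sketch of svd-based observation sanitization for a backdoored policy
import numpy as np

def estimate_safe_subspace(clean_states, d):
    # clean_states: (n, state_dim) observations gathered by running the given
    # policy in a trigger-free environment
    mean = clean_states.mean(axis=0)
    _, _, vt = np.linalg.svd(clean_states - mean, full_matrices=False)
    return mean, vt[:d]                      # (d, state_dim) orthonormal basis

def sanitize(obs, mean, basis):
    # project the observation onto the estimated safe subspace; any trigger
    # component orthogonal to that subspace is removed before the policy acts
    centered = obs - mean
    return mean + basis.T @ (basis @ centered)

# illustrative usage with placeholder interfaces (collect_states, policy.act and
# env.step are not a specific api)
# clean_states = collect_states(env_clean, policy, n_steps=50_000)
# mean, basis = estimate_safe_subspace(clean_states, d=20)
# obs, done = env.reset(), False
# while not done:
#     obs, reward, done, info = env.step(policy.act(sanitize(obs, mean, basis)))
```

as the reviews point out, such a wrapper relies on the trigger living largely outside the span of clean states and on access to clean-environment interactions, which is where the stated restrictions on the attacker matter.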
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper introduces a new method based on learnable hash functions to reduce the on2 cost of selfattention in transformers to on15 as previously known for bucketbased approaches this cost is only achieved when buckets are balanced the paper investigates the effectiveness of related approaches regarding bucket imbalance issues by showing statistics for several attention heads of a pretrained transformer a precise metric is designed to quantify this notion of efficiency attention utility and later this metric is optimized by learning separate parameterized hash functions for queries and keys to be able to optimize this metric the authors use the unbiased approximation to the softmax function proposed by performer choromanski et al 2020 as a regularization term experiments on several nlp and cv tasks show that the proposed method achieves better results than previous fasttransformers while being faster than a standard transformer pros i really enjoyed reading this paper it starts by distinguishing itself from related works by investigating the critical issue of not having balanced buckets showing that this characteristic is a common pitfall of previous popular works therefore the foundation is clear and has practical implications moreover quantifying the degree to which an approximate selfattention can imitate the full attention is very informative and might guide future work on fast transformers the authors acknowledge that computing this number is an nphard problem but they go further and try to optimize this metric during training the idea of having a learnable function prior to bucketing is not entirely novel see a contemporary idea by treviso et al 2021 but the formulation is concise and clear enabling generalizations to previous approaches the experiments span a large set of tasks in nlp and cv with increasingly large input lengths lm experiments show that the proposed method trainable lsh outperforms reformer untrainable lsh on multiple settings results on the glue benchmark show that the proposed method achieves results on par with the original roberta model which is a good sanity check the experiments on long range arena give the final flavor to the paper with the proposed method outperforming related works on all considered tasks while being cheaper higher throughput to train than a standard transformer cons however i was also able to spot some concerns in this paper concretely the core of the proposed method relies on optimizing the attention utility which relies on the performers softmax kernel to regularize the training regime the paper opts to use a kl between the softmax kernel and lha to that end it would be interesting to see how this kl term evolves during training the choice of 10 heads from the 3rd layer in section 32 looks arbitrary in fact it is widely known that different heads at different layers focus on distinct tokens thus what is the motivation for that choice moreover from which pretrained model were the results extracted this is not clear in the paper figure 1 shows the bucket size of queries and keys for each attention head using lsh attention since lsh attention was originally designed to use tied queries and keys ie queries equal keys how can queries and keys have different bucket sizes also it is unclear if queries and keys were tied for all experiments using lsh attention in the rest of the paper which setting regarding the number of attention heads and
layers was used for experiments on the glue benchmark and lra that is how many layers and heads were set to use lha in section 32 it is said that queries should attend to enough keys to get a good approximation of the fullattention is there a precise notion of what is enough in this context that is for recovering the true softmax distribution queries should attend to all keys in contrast if we aim to recover an entmax distribution queries can attend to only a subset of keys see the sparseconsistency property in treviso et al 2021 so what would change in lha if we want to approximate entmax instead of softmax minor comments while equation 16 is easy to follow there is a big jump in equation 17 in section 43 since hkkj does change for the queries should perhaps read since hkkj does not change for the queries refs treviso m gois a fernandes p fonseca e martins a f 2021 predicting attention sparsity in transformers arxiv preprint arxiv210912188 update i thank the authors for their responses and efforts to improve the paper the newly added experiments alongside the discussion addressed my concerns i believe this paper provides a new angle to study the efficiency of bucketbased methods proposing a method that overcomes issues found in previous approaches therefore the paper presents a step forward for this field and i recommend it to be accepted overall i think this paper is wellwritten has a clear and practical motivation and provides an elegant solution to the quadratic bottleneck issue in transformers most of my concerns are about the lack of correctness in some parts of the paper such as choosing hyperparameters or evaluating lsh attention however the authors can easily address these concerns in the rebuttal period therefore i am leaning towards acceptance docsepthis paper proposes a learning to hash attention lha to learn sparse attention for transformers the proposed lha addresses the limitation of annbased sparse attention methods by separate learnable hash functions for queries and keys and utilizes kernelized techniques for efficient approximation of attention utilities experiments on several applications validate the effectiveness of the proposed lha strengths the attention sparsification is important to reduce the complexity of transformers when applied to long sequences the imbalance of bucket size and querykey ratios are studied which is closely related to efficiently reducing the complexity and performance on downstream tasks the attention utility is proposed as a metric to measure how well the attention weights are preserved the attention utility is used to train the learnable hash functions since the argmax operation makes it impossible to train the hash functions from the downstream tasks an approximation of the attention utilities is proposed with random fourier features to reduce the complexity weaknesses what are the statistics of the bucket size and querykey ratios for the proposed lha is it significantly better than lsh it is not clear how to formulate the final training objective and balance the loss terms it is not clear how to apply lha as a plugandplay replacement for dense attention layers in pretrained transformer models since the lha method introduces the learnable hash functions hk and hq the hash functions should be trained on the target dataset the implementation is inconsistent with the analysis by using the token sorting method in roy et al 2021 the validation of attention biclustering is no longer guaranteed however the attention utility is meaningful only if the attention
biclustering is guaranteed minor the notation of key and query is confused qi should be the query and a key should use ki before 14 to make the paper selfcontained it is necessary to include a brief introduction to the compared baselines in table 3 adding a column to show what type of attention sparsification would help the reader to understand the comparison below 14 is not longer should read is no longer below 16 hkkj does change should read does not change this paper proposes to learn separate hash functions for keys and queries with the guidance of attention utility which is the metric proposed in this paper however the implementation is inconsistent with the analysis and important method details are missing at this point i would tend to vote this paper slightly below the bar if the authors can address my concerns i would be happy to increase my score docsepthis work studies the contentbased sparse attention in transformers and how to improve the selfattention part ie efficiency and effectiveness the authors identify the bucket imbalance problem in annderived contentbased sparse attention and analyze its weakness on the imbalance problem attention biclustering is introduced to find the optimal attention utility the learning to hashbased attention model is proposed to improve the effectiveness of sparse attention experiments on different applications support their claims concerns 1 it is wellknown that lsh is dataindependent hashing which is formulated by random projection with some predefined metrics notably using learning to hash is better than lsh and its variants in most cases for representation learning therefore the authors need to clarify why replacing lsh with learning to hash models is useful in sparse attention 2 it is questionable whether such a simple replacement could fully support your work in the present form the authors should give full reasons for your contributions 3 the authors fail to convince the reviewer what the connections between theorem 1 and your proposed method are to me you can directly claim there are some limitations on lshbased methods and implementing learning to hash can improve its representation capabilities on feature learning 4 efficiency is one of the claims of the proposed method and more analysis on the efficiency and efficacy is necessary the authors try to improve some limitations of the contentbased sparse attention when using lshproduced sparse attention patterns a learningtohash attention is formulated to enhance its model expressiveness experiments show the usefulness of the proposed methods however there are questionable points that should be clarified ### Summary:
this paper adds to the literature on efficient sparse attention for longrange transformer architectures a learned hash function is proposed by building successfully upon contributions from previous work a similar idea appears in contemporary work but with clear and complementary differences the reviewers are convinced of the importance of attention complexity and bucket imbalance issues and agree that learningtohash is a promising solution the authors have clarified almost all outstanding concerns in some cases adding
Below is a given review of a research paper from a conference journal. Please write a summary of the review. ### Review: this work presents a method to efficiently sample from a pretrained ddpm by solving a dynamic programming problem that can maximize the log likelihood of the data samples given a fixed computational budget this is done by defining a leastcost path problem to select a reduced set of time steps among a full grid of potential time steps across different possible step budget sizes where the elbo is used as the cost function the authors show that their method can identify ddpm schedules that can achieve significantly higher log likelihood ie lower bits per dim than prior ddpm schedules in the regime where about a hundred steps or fewer are used strengths the dynamic programming problem identified by the authors is an elegant and efficient approach to address the sampling limitations of ddpms it is natural to frame the search for an optimal schedule as a dynamic programming problem and the authors show this problem can be efficiently solved in linear rather than quadratic time the proposed method shows a significant improvement in model performance as measured by log likelihood compared to prior methods when applying a pretrained ddpm over a greatly reduced set of time steps weaknesses the main weakness of this work is that the method appears to overfit the elbo objective without improving and potentially reducing the visual quality of generated samples in particular the proposed method can significantly improve the log likelihood over fewstep diffusion paths compared to prior techniques however the dynamic programming step schedules can actually decrease the quality of visual appearance as measured by fid compared to previous methods personally i consider fid to be a much more reliable indicator of model quality than the log likelihood due to its sensitivity to small changes ability to detect mode coverage and the fact that fid is modelagnostic while log likelihood can only be applied to models with a tractable density or elbo the authors acknowledge this limitation and explore efficient schedules for maintaining low fid high visual quality but these results do not improve upon prior methods thus while the authors achieve their intended goal of efficient and high log likelihoods via their new method the outcome might not be particularly meaningful since it doesnt really improve model sample quality i am unsure of the relevance of section 3 how does this fit into the presentation in section 4 see other comments below other comments in section 3 there is a claim that these equations show that we can perform inference with any ancestral sampling path ie the timesteps can attain continuous values but in section 4 there is a claim that for timecontinuous ddpms the choice of grid ie the t1 ... tT-1 can be arbitrary for models trained with discrete timesteps the grid must be a subset of or the full original steps used during training why does the method not work for arbitrary continuous time steps if the model is trained with discrete time steps the first claim makes it seem like that would be possible why were some of the models used retrained instead of doing testing using only fixed pretrained models overall i found the approach to efficient ddpm sampling employed by the authors to be sensible and reasonably novel while their method can indeed effectively increase log likelihood for ddpms with a greatly reduced grid of time steps this did not appear to translate to improved model quality in
terms of actual generated samples the final conclusion is therefore somewhat unsatisfying because an ideal ddpm schedule would be short and efficient able to produce high log likelihoods and able to produce low fid scores compared to other methods since this goal is not achieved i recommend that the authors revisit their approach to identify if there is a way to more effectively incorporate sample quality rather than log likelihood in their dp algorithm docsepthis paper presents a dynamic programming algorithm to sample from diffusion models in short they solve a dijkstra-type problem on pretrained diffusions and show good results even with coarse discretizations some things are introduced but never clearly explained the most important one is that you never explicitly say what decomposable means i think most readers can figure it out but only after reading the paper you write a bit on page 2 but i would make it even more clear as it is important for reading the rest also what is an elbo path on page 4 you write we can optimize a loss or reward function with respect to the timesteps themselves after the ddpm is trained can you explain again what this means condition 1 on page 4 the path starts at t = 0 and ends at t = 1 is it not possible to both scale and translate the timescale how restrictive is this really what is it about some regularization methods that makes your approach not work breaking the decomposability can the authors think of other regularization methods that break the approach my key concern is actual compute time say seconds how long does dp stride with 128 steps take compared to 128 quadratic stride to me it looks like 128 is enough for quadratic stride to catch up to your method so how much is there to win by choosing your algorithm overall the paper gives a nice overview of the literature and presents a new inference scheme in posttraining scenarios i have some remarks above that can make me reconsider my evaluation and i hope for a nice discussion with the authors docsepsamples are generated from ddpms by solving an sde often in discrete time which here refers specifically to the eulermaruyama discretisation this necessitates a choice for where to make numerical steps each choice of step locations has a corresponding elbo this paper demonstrates that on a pretrained model the optimal elbo may be obtained via a dynamic programming algorithm for the location of the steps the paper is very clearly written and i enjoyed reading this paper the main attraction of this paper is the dramatic increase in computational efficiency the authors discuss one example in which 4000 steps in the sde solver are replaced with merely 32 steps i will tend to refer to things as being steps of an sde solver even in the discrete case since thats basically whats going on this is certainly a dramatic claim as the high computational cost of ddpms has so far been one of their major limiting factors the suggested technique seems to be mostly reasonable overall the dramatic reduction in steps feels too good to be true a sentiment that is largely borne out by section 51 in which it is demonstrated that improving the elbo does not necessarily imply improving the fid as the authors note it is possible to derive multiple valid elbos so this is a case in which optimising the elbo need not imply actually improving the model overall my take on this paper is that speed is improved but it is hitandmiss whether model performance is compromised whilst doing so this is reflected in my middling acceptance score
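to make the step-selection idea described in these reviews concrete (a least-cost path over a grid of candidate timesteps with an elbo-style term as the per-segment cost), here is a minimal illustrative sketch; the cost matrix, the grid and the function name are assumptions for illustration rather than the papers actual implementation, and this naive version runs in quadratic rather than the linear time the paper reportedly achieves

```python
import numpy as np

def best_schedule(cost, budget):
    # cost[s, t]: per-segment cost (e.g. a negative-ELBO term) of jumping
    # directly from grid point s to grid point t, for s < t; shape (T, T).
    # Assumes budget < T. Returns the minimum total cost of a path from
    # grid point 0 to grid point T-1 using exactly `budget` jumps, plus the path.
    T = cost.shape[0]
    D = np.full((budget + 1, T), np.inf)      # D[k, t]: best cost of reaching t in k jumps
    D[0, 0] = 0.0
    parent = np.zeros((budget + 1, T), dtype=int)
    for k in range(1, budget + 1):
        for t in range(1, T):
            cands = D[k - 1, :t] + cost[:t, t]    # try every predecessor s < t
            s = int(np.argmin(cands))
            D[k, t], parent[k, t] = cands[s], s
    path, t = [T - 1], T - 1                      # backtrack the chosen grid points
    for k in range(budget, 0, -1):
        t = int(parent[k, t])
        path.append(t)
    return D[budget, T - 1], path[::-1]
```

under these assumptions best_schedule(cost, budget=32) would return the cheapest 32-jump schedule through the grid, which is the kind of reduced set of timesteps the reviews above refer to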
with some refinement i could see the techniques this paper proposes being of great utility figure 5 one meaningful weakness in the presentation is figure 5 in which i think different brownian sample paths were used to generate each image i do note that the text claims that the same random seed was used but the variety both within each group of steps and between each group of steps means i am skeptical my guess is that a) different brownian sample paths were used for each group of steps and b) within each group of steps using the same random seed does not actually refer to using the same brownian motion rather it refers to using the same increments each of which is rescaled by alpha sigma or g depending on your notation this is not at all the same thing as using the same brownian motion the appropriate thing to do would be to use the same continuoustime brownian motion sample for every single picture shown in figure 5 every time a point is queried presumably nearly always at a point that it has not been queried at before as different step schemes may place steps in very different places then a brownian bridge should be constructed between the two samples already observed either side of it the authors have not released code so i cannot see what library they are using themselves but the above procedure may easily be done using the brownianinterval of the torchsde library 1 (a minimal illustrative sketch of this procedure is included after the summary below) make sure to use a single brownianinterval object for the entirety of generating a figure recreating a new one at any point would be a mistake as it is deterministic only up to both its seed and the points it has already been queried at to give the appropriate references the brownian interval was introduced in 2 as an improvement of the virtual brownian tree of 3 if the above procedure is followed then i would expect the generated samples to much more closely resemble each other and in doing so be able to better understand the effect of increasing the number of steps which is after all central to this paper other remarks equation 16 is clearly central to the paper however it pretty much comes out of nowhere at least for the reader who doesnt hold all the mathematics of ddpms in their head i think that a derivation would be a meaningful improvement to the paper the dynamic programming algorithm outlined in section 42 feels essentially standard besides dijkstras algorithm it also seems very reminiscent of dynamic time warping i regard the main contribution of this paper as the identification that step locations can be chosen via dp not the algorithm itself the entire paper is framed only in the context of inference i speculate that it might also be useful in the context of training minimising training costs especially for expensive models such as these is a topic of great importance perhaps the procedure suggested in this paper could be rerun every n training steps for some n ethics statement i would have thought that improving the computational efficiency of costly models would have some perhaps small positive impact on the pressing issue of climate change it seems a bit perverse that this positive ethical impact is not discussed in the ethics statement minor points dkl never has brackets around its arguments eg its just dkl pxqx rather than dklpx qx or dklpxqx page 4 the abbreviation ie is usually discouraged in academic writing algorithms 1 and 2 these are a weird mix of pseudocode and python i think it would be preferred to pick just one especially as they rely on behaviour specific to numpy such as indexing by none i am
not convinced how meaningful the discussion in section 43 really is it points out that o(t) forward passes are required as each forward pass takes o(t) work then overall o(t^2) work is required exactly as expected what is new here i dont think bpd page 7 is defined references 1 li torchsde https://github.com/google-research/torchsde 2 kidger et al efficient and accurate gradients for neural sdes neurips 2021 https://arxiv.org/abs/2105.13493 3 li et al scalable gradients for stochastic differential equations aistats 2020 https://arxiv.org/abs/2001.01328 possibly with some refinement the paper has the potential to be very good as it stands it presents a dramatic speed improvement that may or may not compromise the final model overall i recommend acceptance ### Summary:
this paper proposes a dynamic programming strategy for faster approximate generation in denoising diffusion probabilistic models all reviewers appreciated the paper but they are not overly excited two reviewers are focused on the log likelihood not being the objective for image quality this ac does not really buy this argument the method and the story around it are wellrounded and finished so it is hard to think of any major modifications that will change the overall story a lot one could therefore argue for acceptance as it stands on the other hand this is difficult to argue for given the below-acceptance-level scores so the final recommendation is reject with a strong encouragement to submit to the next conference updating the paper with preemptive arguments on why the elbo and not fid is the right thing to consider
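the shared-brownian-motion comparison suggested in the third review above can be sketched with torchsdes brownianinterval; this is a hedged illustration only, where the model.drift / model.diffusion calls, the increasing timesteps grid in [0, 1] and the tensor sizes are assumed stand-ins for whatever parameterisation the ddpm actually uses, not the papers code

```python
import torch
import torchsde

# One shared Brownian motion for every step schedule: new query points are filled
# in consistently (via Brownian bridges internally), so images generated with 8,
# 32 or 128 steps differ only through discretisation error, not through new noise.
batch, dim, device = 16, 3 * 32 * 32, "cuda"          # assumed sizes
bm = torchsde.BrownianInterval(t0=0.0, t1=1.0, size=(batch, dim), device=device)

def sample(model, x, timesteps):
    # Schematic Euler-Maruyama-style loop over an increasing grid in [0, 1];
    # model.drift and model.diffusion are hypothetical stand-ins.
    for ta, tb in zip(timesteps[:-1], timesteps[1:]):
        dt = tb - ta
        dW = bm(ta, tb)        # increment of the same underlying path for all schedules
        x = x + model.drift(x, ta) * dt + model.diffusion(x, ta) * dW
    return x
```

using a single bm object for every schedule, as the reviewer recommends, is what makes the pictures across step counts directly comparable; recreating it per schedule would reintroduce unrelated noise realisations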
2789, 634, 2746, 417, 789, 10155, 253, 11101, 993, 1430, 50276, 5092, 253, 4477, 1158, 273, 643, 37820, 3082, 326, 2740, 253, 2746, 50276, 2577, 2234, 4468, 275, 4588, 11897, 673, 1333, 7253, 849, 1048, 1057, 352, 1379, 33234, 31482, 342, 12842, 5018, 1379, 2429, 281, 12842, 21396, 31482, 281, 479, 352, 4453, 751, 12842, 310, 2217, 323, 21396, 31482, 281, 5834, 598, 281, 634, 1332, 594, 849, 1199, 310, 627, 281, 3330, 407, 13887, 634, 5933, 4583, 253, 2929, 4245, 247, 5322, 18389, 273, 253, 6239, 285, 1246, 247, 747, 17032, 6974, 275, 1501, 31158, 15216, 891, 452, 690, 16157, 1840, 326, 476, 1056, 479, 24033, 619, 7103, 285, 891, 3524, 323, 247, 5322, 5955, 342, 253, 4477, 5474, 339, 793, 10240, 403, 4561, 432, 277, 12132, 983, 407, 16161, 271, 256, 615, 2223, 275, 13358, 673, 534, 310, 908, 281, 3730, 281, 5742, 253, 299, 335, 693, 274, 7352, 2902, 35132, 5837, 436, 2436, 36269, 247, 4327, 323, 835, 281, 1056, 10704, 5018, 1016, 4327, 273, 3213, 8593, 556, 247, 3969, 1045, 2399, 436, 2929, 14371, 326, 327, 247, 3215, 11273, 1566, 253, 8654, 1045, 2399, 778, 320, 2797, 3066, 247, 7870, 10717, 5933, 323, 253, 4328, 273, 253, 5018, 253, 2929, 310, 1077, 4518, 3542, 285, 891, 11346, 4361, 436, 2929, 50276, 783, 2022, 21779, 273, 436, 2929, 310, 253, 14138, 2572, 275, 15180, 6733, 50276, 783, 4477, 2319, 581, 1650, 275, 534, 35059, 5018, 275, 253, 256, 615, 47037, 403, 7932, 342, 7960, 4567, 5018, 891, 588, 5257, 281, 3730, 281, 1841, 347, 1146, 5018, 273, 271, 256, 615, 47037, 1014, 275, 253, 13358, 1083, 1580, 28763, 10323, 47515, 1469, 327, 436, 310, 5604, 247, 14138, 1750, 347, 253, 1029, 15180, 2105, 273, 277, 12132, 983, 556, 594, 2080, 644, 581, 273, 616, 2201, 14155, 2616, 50276, 783, 5125, 5853, 3133, 281, 320, 6571, 5272, 4583, 253, 14138, 5141, 275, 5018, 9193, 1512, 1175, 281, 320, 2032, 50276, 66, 21942, 326, 310, 8127, 32708, 562, 407, 2593, 8319, 275, 534, 352, 310, 5183, 326, 11138, 253, 1045, 2399, 1057, 417, 7933, 16084, 11138, 253, 269, 301, 347, 253, 4477, 3877, 352, 310, 2709, 281, 15313, 2709, 3588, 1045, 37298, 594, 436, 310, 247, 1083, 275, 534, 5556, 2182, 253, 1045, 2399, 878, 417, 16084, 2686, 11138, 253, 1566, 50276, 1189, 455, 619, 1379, 327, 436, 2929, 310, 326, 3885, 310, 5520, 533, 352, 310, 4352, 395, 3099, 1880, 1566, 3045, 310, 25047, 16682, 2509, 594, 436, 310, 11392, 275, 619, 278, 2016, 1981, 14764, 593, 4868, 342, 690, 29646, 891, 812, 923, 253, 5609, 436, 2929, 29328, 1146, 273, 1270, 11839, 50275, 13206, 608, 50276, 531, 14282, 14855, 275, 253, 9759, 310, 4677, 608, 275, 534, 891, 1158, 1027, 8516, 757, 3410, 11865, 497, 908, 281, 6635, 1016, 2460, 891, 513, 3877, 326, 253, 2505, 3916, 326, 253, 1072, 3632, 8357, 369, 908, 533, 253, 5235, 50276, 15617, 1561, 1016, 1387, 1171, 20528, 285, 875, 1016, 1387, 1171, 20528, 2097, 891, 717, 33872, 619, 5476, 310, 326, 247, 1027, 8516, 757, 3410, 11865, 497, 908, 323, 1016, 1387, 273, 5018, 285, 270, 1561, 1016, 1387, 273, 5018, 970, 253, 1072, 3632, 8357, 1057, 417, 2686, 3730, 281, 970, 253, 1072, 8516, 757, 3200, 2581, 352, 10770, 281, 970, 253, 1072, 42344, 1016, 273, 534, 403, 46595, 264, 407, 9765, 40009, 390, 305, 7293, 327, 634, 14951, 436, 310, 417, 387, 512, 253, 1072, 2181, 347, 970, 253, 1072, 8516, 757, 3200, 50276, 783, 4569, 2181, 281, 513, 651, 320, 281, 897, 253, 1072, 44351, 26202, 553, 8516, 757, 3200, 3410, 323, 1046, 2014, 5406, 2011, 275, 4677, 608, 1046, 673, 247, 1127, 310, 32305, 728, 18289, 4829, 1900, 387, 247, 1127, 326, 352, 556, 417, 644, 32305, 728, 387, 1078, 347, 1027, 3213, 
15849, 778, 1659, 5018, 310, 1077, 1027, 5053, 840, 247, 8516, 757, 9729, 943, 320, 8818, 875, 253, 767, 3530, 2168, 2540, 2057, 1930, 273, 352, 50276, 783, 4477, 452, 417, 4439, 2127, 594, 891, 2550, 923, 752, 6335, 597, 403, 970, 3746, 533, 253, 1840, 5199, 778, 4354, 320, 2218, 970, 253, 8516, 757, 31251, 273, 253, 30162, 84, 615, 6335, 337, 1056, 2119, 281, 897, 247, 2014, 8516, 757, 31251, 1789, 323, 253, 25983, 273, 11365, 247, 4677, 761, 675, 272, 247, 747, 581, 387, 667, 1127, 651, 320, 247, 10551, 347, 352, 310, 30027, 760, 598, 281, 1097, 697, 8357, 285, 253, 2792, 352, 556, 2168, 644, 32305, 728, 387, 281, 1918, 253, 4569, 10414, 253, 8516, 757, 7726, 369, 5611, 275, 374, 347, 271, 7756, 273, 253, 7503, 8516, 757, 5202, 273, 495, 50276, 338, 253, 1840, 5199, 310, 3560, 840, 891, 651, 1902, 253, 4561, 3530, 281, 1199, 625, 8244, 28788, 1016, 275, 643, 285, 275, 2509, 594, 320, 2104, 281, 1805, 2096, 253, 1055, 273, 3629, 253, 1180, 273, 5018, 534, 310, 846, 512, 4275, 281, 436, 2929, 50275, 977, 16157, 50276, 29813, 1668, 310, 4518, 4275, 281, 253, 2929, 2299, 352, 3965, 1199, 3249, 562, 273, 17663, 387, 1878, 323, 253, 9414, 665, 36908, 2186, 512, 253, 23065, 273, 277, 12132, 983, 275, 616, 1481, 891, 1158, 326, 247, 28529, 651, 320, 247, 14282, 7756, 281, 253, 2929, 50276, 783, 7870, 10717, 5933, 18627, 275, 2593, 5976, 9193, 9093, 2629, 50276, 67, 11587, 1073, 17443, 1344, 284, 5933, 352, 671, 3133, 1077, 35036, 273, 7870, 673, 2137, 14650, 891, 2743, 253, 2022, 7680, 273, 436, 2929, 347, 253, 8137, 326, 3213, 8593, 476, 320, 6777, 3066, 33234, 417, 253, 5933, 3139, 50276, 783, 2862, 2929, 310, 29318, 760, 275, 253, 3634, 273, 17032, 891, 30821, 326, 352, 1537, 671, 320, 4217, 275, 253, 3634, 273, 3733, 7221, 2182, 3733, 4815, 3340, 323, 8214, 3210, 824, 347, 841, 310, 247, 9400, 273, 1270, 6349, 4931, 253, 5199, 5125, 275, 436, 2929, 812, 320, 294, 6321, 1046, 295, 3733, 5018, 323, 690, 295, 50275, 678, 982, 3908, 50276, 74, 651, 452, 1869, 326, 11138, 253, 15180, 6733, 273, 19983, 3210, 651, 452, 690, 4931, 1355, 2762, 3486, 327, 253, 17178, 2523, 273, 7952, 1818, 352, 3133, 247, 2372, 591, 3025, 326, 436, 2762, 16289, 3486, 310, 417, 5469, 275, 253, 18035, 3908, 50275, 37585, 2792, 50275, 69, 7261, 1620, 556, 26609, 1475, 697, 7125, 50276, 909, 697, 816, 277, 7261, 268, 89, 82, 89, 2581, 685, 277, 7261, 3498, 2805, 89, 390, 277, 7261, 3498, 82, 89, 50276, 6377, 577, 253, 31931, 2492, 26332, 310, 3798, 42965, 275, 11073, 4028, 50276, 267, 46042, 337, 285, 374, 841, 403, 247, 12504, 247, 5878, 273, 10585, 406, 853, 285, 15548, 891, 1158, 352, 651, 320, 9013, 281, 2619, 816, 581, 3340, 347, 597, 10725, 327, 8770, 2173, 281, 36950, 824, 347, 44176, 407, 5293, 50276, 74, 717, 417, 13762, 849, 14282, 253, 5955, 275, 2593, 7652, 1663, 310, 352, 2792, 562, 326, 14366, 3579, 11999, 403, 2424, 347, 1016, 3579, 1509, 3936, 14366, 789, 840, 4583, 14366, 19, 789, 310, 2424, 50276, 911, 24374, 347, 3264, 752, 310, 747, 1060, 50276, 74, 13414, 1158, 270, 19875, 3239, 818, 310, 2931, 50275, 250, 3065, 50276, 18, 632, 30162, 84, 615, 5987, 7280, 681, 9906, 36642, 13473, 42585, 615, 50276, 19, 5772, 1063, 1162, 355, 5919, 285, 7899, 27935, 323, 11454, 256, 3229, 5723, 2824, 43425, 5987, 39962, 2061, 5375, 19, 10655, 1012, 35337, 50276, 20, 632, 1162, 355, 44755, 27935, 323, 19191, 8967, 7424, 247, 382, 1832, 9169, 5987, 39962, 2061, 5375, 1518, 6903, 22130, 6830, 342, 690, 29646, 253, 2929, 556, 253, 2442, 281, 320, 1077, 1175, 347, 352, 9572, 352, 10262, 247, 14138, 3885, 7756, 326, 778, 
526, 333, 1439, 4711, 18230, 253, 2457, 1566, 4583, 891, 5583, 14924, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 7870, 10717, 5700, 323, 7938, 16851, 5978, 275, 1850, 80, 2182, 12393, 37851, 3210, 50276, 455, 30628, 14109, 253, 2929, 533, 597, 403, 417, 27662, 9049, 50275, 9389, 30628, 403, 7106, 327, 253, 2412, 12177, 417, 1146, 253, 8103, 323, 2460, 3290, 436, 913, 1057, 417, 1663, 4489, 436, 4154, 50275, 783, 1332, 285, 2926, 1475, 403, 973, 48198, 285, 6699, 594, 352, 310, 1892, 281, 1158, 273, 667, 2201, 14586, 326, 588, 1818, 253, 4583, 2926, 247, 2257, 581, 812, 3103, 9059, 323, 14924, 347, 352, 9572, 327, 253, 643, 1133, 436, 310, 2834, 281, 9059, 323, 1677, 253, 2708, 14924, 1268, 7363, 50275, 601, 253, 2457, 17401, 310, 12009, 342, 247, 2266, 31868, 281, 11929, 281, 253, 1735, 8059, 22753, 253, 2929, 342, 36588, 422, 7125, 327, 2139, 253, 1045, 2399, 285, 417, 269, 301, 310, 253, 987, 2181, 281, 1908 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors propose a probabilistic framework to improve the classification accuracy in instances when there exists missing data in the multimodality datasets where one of the modalities is the predictive label however this label is not assumed missing to this end they propose a generalized softmax function as the joint distribution of all modalities and the label from which conditional distributions are derived for computing the maximum likelihood estimate mle experimental results on enterface and ravdess datasets demonstrate improvements in classification accuracy over baselines in addition the authors investigate the influence of the influence of the backbone models and the fusion functions the main contribution of the paper is the proposal of the generalized softmax function to model the joint distribution of all modalities and the label the generalized softmax function consists of the product of the marginal distributions of the modalities and the label as if they were independent subsequently compensating for this the dependence among the modalities and the label via an exponential function enhanced with feature extraction models this joint distribution leads to computationally efficient conditional distributions however there are a few concerns about the approaches and the evaluations in this paper the most significant concern is about the baseline comparison the authors set these baselines as instances of specifically defined simpler models or their own model in order to highlight specific manner in which those models deal with the missing modality however there are many prior works focused on solving the exact missing modality problem see below the authors should thus compare against those baselines instead of deriving their own baseline model instances 1 multimodal generative models for scalable weaklysupervised learning wu and goodman 2 privateshared disentangled multimodal vae for learning of latent representations lee and pavlovic 3 mhvae a humaninspired deep hierarchical generative model for multimodal representation learning vasco melo and paiva in the experiments tab1 and tab2 the visual missing rate and the audio missing rates are likely those in the training set it is not clear what are the missing rates are for the testing set if any the authors should clarify this in fig4 c and d the classification accuracy of the happiness emotion by zp is higher than that by the proposed method this single value may be caused by several factors hidden or accidental thus it may not be sound to claim that the proposed method is more efficient to exploit the information in most categories to support that claim more investigation is needed and the authors should present it i find the claims in the paper largely unsubstantiated due to the lack of comparison to baselines as mentioned in the main review above docsepthis submission proposed a maximum likelihood estimation framework combined with a generalized softmax function to resolve multimodal emotion recognition with missing modality two emotion recognition datasets are used in experiments to make comparison with several baseline methods the results suggest that the proposed approach outperforms these compared methods moreover according to the authors the endtoend nature of this framework makes it more efficient than previous works 1 in introduction the authors states that compared with unimodal learning multimodal learning 
can effectively utilize the multimodal data to achieve better performance actually in some cases multimodal data must be utilized properly to make multimodal learning more effective than unimodal learning for eg researchers have found that the best unimodal model can outperform its multimodal counterpart in this paper w wang et al what makes training multimodal classification networks hard the authors should try to make the statements more accurate 2 the author have mentioned in page 4 that in the following we will show that we use empirical distribution to implement these underlying marginal distribution in our algorithm but the reviewer could not find any descriptions in the following paragraphs 3 in experiments on enterfaces05 the condition with 100 missing rate should be considered which could be helpful to demonstrate whether the left 5 data in 95missing case is indeed used for the task or just because the other complete modality of data 4 the authors mentioned several different methods dealing with missing modality in related works but no experiments to compare the performances between the proposed methods and the mentioned framework at the same time the comparative methods in this submission are less persuasive there is a mismatch between the title and the content given that the proposed methods are only verified in one application scenario ie emotion recognition however the multimodal learning is such a big topic including but not limited to emotion recognition action recognition etc the authors may consider to extend the methods in order to match the title or change the title to a specific area besides the experiments the authors have conducted is far from extensive and the comparative methods are not sufficent to support the conclusion docsepthis paper deals with multimodal learning with missing modality in training specifically the proposed method is based on maximum likelihood estimation to obtain the conditional distributions of the socalled the modality complete data and the modalitymissing data in which a multimodal softmax function is defined to implement this framework in an endtoend manner strengths multimodal learning has achieved great success for many applications and having missing modality is an important challenge to be tackled this paper presents a simple endtoend method in a way novel and contributing to the field as being based on maximum likelihood which is not presented by prior art weakness the biggest limitation of this work is considering only the multimodal data having two modalities as well as not even discussing how the proposed method behaves in case of having more modalities although the paper presents a taskfree method the experimental analysis were limited to two datasets both are addressing the same task emotion recognition as also mentioned in the introduction there are several other tasks that the proposed method could have been tested on indeed the related work such as ma et al smil multimodal learning with severely missing modality was tested on several different tasks i suggest authors to either change the paper including the title abstract and related work and target categorical emotion recognition or comprehensively extend the experimental analysis such that the proposed method would be tested and validated on several other tasks it is also important to mention that the proposed method was tested only for categorical emotion recognition while emotion datasets are typically multilabeled as humans cannot elicit only one emotion at a time and 
also include continuous values therefore regression task might be targeted too i also found the used datasets limited in terms of their size in case authors would like to keep the emotion recognition task as the testbed i suggest them using a much larger dataset called cmumosei also having other modalities than video and audio indeed some related work was tested on cmumosei andor cmumosi such as ma et al smil multimodal learning with severely missing modality another limitation regarding the experimental analysis performed is that as modality only visual and audio data were used testing on combinations of several other modalities text depth data data of mocap accelerometer gyrometer would improve the validity of the proposed method in addition the generalized softmax function we propose generalized softmax function might be misleading it more sounds like the used softmax has tolerance to the diversity of samples belonging to different classes or somehow a domain adaption is being applied but these are not the cases it is unclear why authors think that the multimodal softmax is a contribution eq 3 and following equations look like standard softmax was written for multimodal data instead of first fusing the data and representing the fused data as a single feature vector on the other hand the fusion of the data is performed through standard strategies addition concatenation and multiplication i expect authors to clarify the contribution in this respect there are also lack of information regarding how the data is being processed in detail a what does we take central frame as visual modality mean do you take a bunch of frames and use only the central frame if so what is the motivation behind this what is the window size in fact it is more frequent to apply spatiotemporal processing for example processing motion and appearance in facial images for emotion recognition thus i do not understand the rationale behind discarding the temporal information b another issue is reading the audio data it is not clear what the audio data chunk selected to calculate the log melspectrogram c on each processed dataset we split all data into three parts training set validation set and test set their proportions are 70 15 and 15 are you randomly picking these splits and applying sort of a kfold cross validation or these splits are obtained only once and fixed do you guarantee that you use exactly the same split for all baseline methods this is a matter because the used datasets are relatively small i am aware that prior art on emotion recognition uses 5 or 10 fold cross validation for the same datasets and i am not sure why authors have selected a different data splitting strategy the proposed method was compared with some relatively simpler baselines such as zero padding but the comparative study should include the sota methods eg ma et al 2021b tran et al 2017 chen zhang 2020 liu et al 2021 suo et al 2019 given this lack of comparison i believe that the claim of authors which lead to the information of the modalitymissing data not being well exploited to was not justified as well i believe authors should include a better discussion why they tackle with the missing data only in training but never take into account that there could be missing modality in testing as well i think in a practical scenario it is more possible to train a model with a full set of modalities while during test some of the modalities are either completely or only for some test samples missing tables should include the results of unimodal 
data processing to allow reader to understand which modality perform better than other when used alone and include the results of processing complete data ie no missing modality as the upper bound when the visual modality is missing the classification accuracy is lower than that when the audio modality is missing indicating that the visual modality has a more significant contribution to the classification performance which is consistent with previous works zhang et al 2017 ma et al 2020 i believe the citations in this sentence is a bit irrelevant in detail the authors are not using neither the same feature sets nor the same datasets with the cited works the experimental analysis is limited in terms of several aspects a applications tested only for emotion recognition b fixed type of modalities only audio and video c no comparisons with the sota ### Summary:
three experts reviewed this paper and all recommended rejection the rebuttal did not change the reviewers recommendations the reviewers were not excited by the proposed probabilistic framework and raised many concerns regarding the comparison with baselines and competing methods the limited size of datasets and the limited scope of one dataset for one task considering the reviewers concerns we regret that the paper cannot be recommended for acceptance at this time the authors are encouraged to consider the reviewers comments when revising the paper for submission elsewhere
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposes a hierarchical memory to store features at different semantic levels for fewshot learning across domains it introduces a hierarchical prototype model where each level of the prototypes fetches the corresponding information from the hierarchical memory the authors follow the hyper network design to learn the weights when combining predictions from multiple levels the overall model is optimized using a variational inference framework the proposed method is evaluated on 4 various datasets which have a domain gap between the training data the authors also show that this method is competitive on the commonly used fewshot image classification benchmarks pros the paper is wellwritten and easy to follow the paper structure is very clear the idea of leveraging multilevel semantic information is interesting for fewshot learning across domains although using multilevel features is not something new in the space in general using it as a condition for the prototypes is novel for fewshot learning it seems to be a good extension to the previous memory networks concerns would the proposed method with external memory significantly increase the model size the external memory might be something unfair to approaches without explicit external memory the baselines in the result tables table 4 and 5 may not be the current soda methods according to the public leader board httpspaperswithcodecomsotafewshotimageclassificationonmini2 the prototype completion for fewshot learning zhang et al work achieves 6968 076 on the 5way 1shot experiment using resnet12 which is higher than the reported numbers in table 5 more ablation study in terms of model size eg different backbones training efficiency eg how fast is the variational inferencebased approach the overall idea is interesting but there are concerns regarding the experiments as well as the introduction of external memory postrebuttal comments the authors addressed most of my concerns in the feedback i kept my score and learned towards acceptance docsepthis paper presents a hierarchical version of the variational memory approach of zhen et al 2020 for fewshot learning the core technical methodology follows that developed by zhen et al with the difference being in terms of the model this work utilizes a deep model with perlayer memory whereas zhen et al utilize a single memory for highlevel concepts ie deepest layer only motivating the choice of perlayer memory is the desire to better handle fewshot learning tasks with domain shift as the representations in earlier layers may be more relevant in such scenarios the proposed framework learns a hypernetwork to predict attention weights over the perlayer prototypes from a technical standpoint the work appears to be a relatively straightforward extension of zhen et al 2020 hence experimental validation of the impact of the proposed perlayer memory model is especially important the paper presents results on the same fewshot tasks as zhen et al miniimagenet and tieredimagenet as well as comparison to other metalearning methods on fewshot crossdomain tasks here ablation experiments also show learned weighting of perlayer prototypes to be useful however the experimental results leave open a critical question about the comparison of the proposed approach to the baseline variational semantic memory vsm of zhen et al specifically the results quoted for vsm in table 5 are worse than the results in 
zhen et al 2020 for this same experiment in fact the results reported in table 6 of zhen et al 2020 are better in 3 out of 4 settings than the results reported for the proposed system the discrepancy is method miniimagenet 1shot5shot tieredimagenet 1shot5shot vsm 6572 8273 7201 8677 vsm 6421 7969 6958 8328 ours 6701 8175 7170 8513 where vsm is as reported in zhen et al 2020 vsm is the reimplementation by this submission and ours is the submissions approach since the proposed system is an extension of zhen et al it is quite detrimental to actually perform worse than the baseline these results are also on a central experimental setting fewshow learning with deep models it is not acceptable to present a reimplementation that flips the ranking of the methods at minimum some extensive explanation is required about differences between the original and reimplementation as well as why the original published results were not replicated while this work appears to be a promising extension of the variational semantic memory of zhen et al 2020 to models with multilevel prototypes it omits proper comparison to zhen et al 2020 on a central experiment which when included shows the proposed approach produces worse performance than the baseline the author response should address this discrepancy docsepthe authors propose a novel model that focuses on improving cross domain few shot classification to this end they introduce a hierarchical extension to the work proposed in zhen et al 2020 wherein the latent variables at different levels capture distinct semantic information the proposed framework enables generating class specific prototypes at different hierarchical levels which are then used to make predictions at each level these predictions are ensembled using domain specific weights obtained from the support set via a gradient based method through experiments on various crossdomain and indomain tasks the authors show considerable improvements over the baselines textbfstrengths 1 the paper is well written and easy to follow the authors do a good job of introducing their model elements and contrasting them with previous works 2 the improvements obtained over the baselines are impressive specifically on the task of crossdomain few shot classification the gap between the proposed method and the most competitive baseline is significant 3 additionally the ablation experiments shown are extensive and do a good job of highlighting the importance of each component in the proposed framework textbfweaknesses 1 table 13 highlights that the proposed method is not that effective when using shallow feature extractors like conv4 this probably leads me to believe that ensembling is providing the major improvements and the proposed hierarchical formulation isnt actually that important to this end i have two questions a what happens if you dont use an ensemble and rather just use the logits from the last level keeping the rest of the architecture as is as the later latent variables depend on the earlier ones figure 1 if the hierarchical framework works well you should still see improvements over the baselines b what happens if you train l instances of vsm zhen et al 2020 where each instance is trained on the output of a residual block that is if you remove the connections between mathbfz and mathbfm in figure 1 how crucial is it to have these dependencies between the latent variables it would be great if the authors could comment on this 2 what is the memory overhead of using the proposed hierarchical method over the 
baseline zhen et al 2020 i can imagine that having l layers of memory considerably increases the memory overhead additionally due to the increase in the number of latent variables is convergence slower as well 3 the vsm numbers shown in table 5 are lower than what is reported in zhen et al 2020 the numbers in zhen et al 2020 show that the proposed method is inferior when compared to vsm on withindomain few shot classification i understand that the authors reimplemented their model but is there an explanation as to why the reimplemented numbers are considerably lower this work provides a logical extension to the existing work in zhen et al 2020 by introducing a hierarchical variational memory framework through the experimental results it is evident that the proposed method provides considerable improvements over existing approaches i have some concerns regarding the actual importance of dependencies within the latent variables im still inclined to accept this paper and would be willing to increase my rating if the authors address my concerns ### Summary:
this paper presents a hierarchical memory for cross domain and few shot classification problems the paper is well written tackles an important topic and the proposed approach which is an extension of vsm is interesting reviewer yexz has some concerns regarding comparison to a more proper baseline i believe that the authors have adequately addressed this reviewer 2ajk and g1bf also have suggestions that the authors have incorporated in the revision i recommend accepting this paper
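All three reviews of the first paper describe the same mechanism: class prototypes are formed at several semantic levels of the backbone and the per-level predictions are combined with learned weights. The sketch below is purely illustrative of that mechanism; the interface, the softmax-normalized level weights, and every name in it are assumptions made here, not the submission's actual code:

```python
import torch
import torch.nn.functional as F

def multilevel_prototype_logits(support_feats, support_labels, query_feats, level_logit_weights):
    """Combine prototype-based predictions from several feature levels (illustrative sketch).

    support_feats / query_feats: lists of [n_support, d_l] / [n_query, d_l] tensors,
    one entry per semantic level of the backbone (hypothetical interface).
    level_logit_weights: [num_levels] tensor, e.g. produced by a small hypernetwork.
    """
    weights = F.softmax(level_logit_weights, dim=0)  # normalize the per-level mixing weights
    num_classes = int(support_labels.max().item()) + 1
    combined = None
    for w, s_f, q_f in zip(weights, support_feats, query_feats):
        # class prototype at this level = mean of that class's support features
        protos = torch.stack([s_f[support_labels == c].mean(dim=0) for c in range(num_classes)])
        # negative squared euclidean distance to each prototype gives this level's logits
        level_logits = -torch.cdist(q_f, protos) ** 2
        combined = w * level_logits if combined is None else combined + w * level_logits
    return combined  # [n_query, num_classes]
```

In these terms, the ablation asked for in question (a) above amounts to dropping the learned weights and keeping only the last level's logits.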
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper studies an online learning and contextual pricing problem with semiparametric partially linear demand models where the demand function is the sum of a linear function with unknown coefficients of price or context resp and an unknown function of the context or price resp for the above two demand models the paper develops two corresponding online pricing algorithms with provable regret upper bounds and matching lower bounds with respect to the horizon t strengths overall the paper is wellwritten formulationwise the paper generalizes the commonly used lipschitz continuous assumption to hölder continuity importantly the results show that the additional structure of linearity indeed improves the regret bound algorithmwise the paper combines several online learning techniques including binning and approximation, random shock in pricing, and an upper confidence bound algorithm so as to estimate the nonparametric part of the demand functions analysiswise the paper gives a comprehensive analysis of the statistical complexity/regret of the formulated problems weakness 1 in algorithm 2 when solving the ucb optimization problem in line 10 what is its complexity and in general how can it be solved efficiently 2 the algorithms seem to require the knowledge of beta and k the authors should state these explicitly in the assumptions and discuss how to estimate these quantities when there is no such prior knowledge minor comment upon reviewing the literature the paper misses one stream of works related to semiparametric dynamic pricing see below the author should discuss the contribution against these works policy optimization using semiparametric models for dynamic pricing fan et al 2021 distributionfree contextual dynamic pricing luo and sun 2021 towards agnostic featurebased dynamic pricing linear policies vs linear valuation with unknown noise xu and wang 2022 also the authors should mention that the matching of lower bounds refers to the dependency on t but not with respect to d as in the discussion of theorem 4 one of the lower bounds doesnt meet the corresponding upper bound in the context dimension na docsepthis paper studies two online contextual dynamic pricing problem settings dplpe where demand is linear wrt price and dplce where demand is linear wrt context for the dplpe problem they assume the realized demand $d_t = b p_t + g(x_t) + \epsilon_t$ with $g(x)$ being $\beta$-hölder continuous and $\epsilon_t$ being a subgaussian noise they propose an adplp algorithm that adopts space binning and random shock techniques with adplp they achieve an optimal $\tilde{O}(\sqrt{T} \vee T^{\frac{d}{d+2\beta}})$ regret up to logarithmic factors by proving both the upper and the lower regret bounds where d is the dimensionality of features for the dplce problem they assume the realized demand $d_t = f(p_t) + a^{\top} x_t + \epsilon_t$ with $f(p)$ being $k$th-order smooth with a small parameter delta that could be tdependent they present an adplc algorithm that adopts a local polynomial approximation and a biased linear contextual bandit with an optimistic ofu strategy with adplc they achieve an optimal $\tilde{O}(d\sqrt{T} \vee (\delta T^{k+1})^{\frac{1}{2k+1}})$ regret up to logarithmic factors by proving both the upper and the lower regret bounds finally they conduct numerical experiments and show that the simulation results of adplp and adplc outperform all benchmarks strengths 1 this work generalizes the problem settings of both linear demand and linear context problems for dplpe they consider
the $\beta$-hölder class instead of only the lipschitz class that was broadly assumed by previous works and they improve the comparison by replacing the linear benchmark with the true demand function in 8 and 23 for dplce they consider $k$th-order smoothness with not only integer but also noninteger $k$s and they also emphasize the role that delta plays instead of treating it as a constant as previous works did 2 for both dplpe and dplce they design algorithms with provable optimal regret bounds these results are significant as they not only match the order of t but also those of beta and delta moreover for dplce their upper and lower bound match those in 30 indicating that a linear context added on the demand curve might not require substantially more information to learn 3 their numerical experiments are comprehensive and the results are welldisplayed weaknesses 1 some related literature should be discussed in more detail for example the stream of binary demand models as this work also assumes noisy feedback the only difference with binary feedback is that the noise distribution is dependent on p and x while this work assumes iid noise since the closelyrelated work 30 also assumes binary feedback the results of this work do not actually cover those in 30 overall binary feedback is an important property in many pricing problem settings and i suggest the authors be aware of this issue and place this paper in the related literature with more precision limitations and potential extensions of this work are well discussed there is no discussion of social impact as it is a work of theory however i indeed suggest the authors consider any potential ethical issue that might occur in a pricing problem with these assumptions you have specified docsepthis paper studies contextbased dynamic pricing where the unknown expected demand admits a semiparametric partially linear structure two special cases of semiparametric partially linear models linear pricing effect and linear contextual effect are considered two new algorithms dplpe and dplce are proposed and their regret upper bounds and matching lower bounds are established strengths 1 in spite of being a theoretical paper it is well written and easy to follow 2 the proved matching lower bound is useful to understand the limit of the considered problem weaknesses 1 technical novelty the proposed partially linear demand model is a natural extension of existing linear demand models and nonparametric demand models on the other hand a similar partially linear demand model has been proposed in the dynamic pricing literature see below none of them was mentioned in the paper in fact these papers consider a binary choice model which is arguably more challenging than the one considered in this paper jianqing fan yongyi guo mengxin yu 2021 policy optimization using semiparametric models for dynamic pricing https://arxiv.org/abs/2109.06368 jianyu xu yuxiang wang 2022 towards agnostic featurebased dynamic pricing linear policies vs linear valuation with unknown noise aistats 2022 2 the proposed algorithm and theoretical analysis require the knowledge of some true parameters eg the upper and lower bounds of the price the upper and lower bounds of the true price coefficient the continuity parameter beta of the unknown function g in practical pricing applications knowing upper and lower bounds of the price is arguably a mild assumption but it is less justifiable to know the bound of b and beta in practice 3 it is unclear what assumptions were assumed in the main theorems theorems 1-4 in addition when
the authors compare the proved regret bounds with those in the literature it is also important to discuss if their model assumptions are comparable it would be more convincing if the faster regret bound is not obtained under much stronger assumptions 4 the experiments of this paper only study the performance when the true model is the proposed partially linear demand model no model misspecification was studied it is unclear if it is always safe to use the proposed algorithm in practical pricing applications after rebuttal thanks the authors for carefully addressing all my concerns my last three comments have been nicely addressed i have raised my rating to borderline accept on the other hand while i understand the difference eg nonparametric on noise vs nonparametric on covariates binary feedback vs continuous feedback of this paper compared to existing pricing literature with partially linear demand models i am not fully convinced by the technical novelty beyond them so i choose to only increase the rating to borderline accept no potential negative societal impact was discussed in the checklist no limitation was discussed in the paper docsepthis paper studies the contextual dynamic pricing problem under partially linear structural assumptions in particular the authors consider two demand models dplpe and dplce the former assumes a linear term in the price with an additive $\beta$-hölder continuous function of the context the latter assumes a linear term in the context with an additive $k$th-order smooth function of the price the authors present two online algorithms and provide regret upper bound as well as lower bound guarantees for each of the models 1 dplpe upper bound of $\mathcal{O}\left(\sqrt{T} + \ln T \cdot T^{\frac{d}{d+2\beta}}\right)$ lower bound of $\Omega\left(\max\{\sqrt{T},\, T^{\frac{d}{d+2\beta}}\}\right)$ 2 dplce upper bound of $\mathcal{O}\left(d\ln T\sqrt{T} + (\delta T^{k+1})^{\frac{1}{2k+1}}\right)$ lower bound of $\Omega\left(\max\{\sqrt{T},\, (\delta T^{k+1})^{\frac{1}{2k+1}}\}\right)$ strengths the paper in general is well written and its contribution over previous works is presented clearly by the authors the models considered in the paper are more general compared to known works and they provide new results on the problem of dynamic pricing i find the algorithm design of both algorithms to be interesting and novel as it combines multiple ideas from previous works the authors also provide lower bound guarantees which enhances the main results of the paper weaknesses as stated by the authors the lower bound in theorem 4 is missing the dependence on d which makes this lower bound less significant compared to the lower bound in theorem 2 i believe that the problem setup can be presented more clearly for the reader with better formatting i feel like the main text could benefit from a more comprehensive technical overview i dont see any potential negative societal impact ### Summary:
the reviewers found the paper to be novel and interesting the introduced model was found to be innovative and to lead to cleaner/better regret bounds the only major concern that was raised and not resolved was a lack of technical novelty however it seems that this work provides new and relevant results to the existing literature and it seems that clean and fundamental techniques are indeed adequate for achieving the result of this paper
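For reference, the two semiparametric demand models and the rates discussed in the reviews above can be restated compactly. The notation below is reconstructed from the reviews themselves (b, a, g, f, beta, k, delta as the reviewers use them) and is a paraphrase, not a quotation of the paper:

```latex
\begin{align*}
\text{DPLPE:}\quad & d_t = b\,p_t + g(x_t) + \varepsilon_t, \quad g \text{ is } \beta\text{-H\"older:} \quad
  \text{regret } \tilde{O}\big(\sqrt{T} \vee T^{\frac{d}{d+2\beta}}\big) \\
\text{DPLCE:}\quad & d_t = f(p_t) + a^{\top}x_t + \varepsilon_t, \quad f \text{ is } k\text{th-order smooth:} \quad
  \text{regret } \tilde{O}\big(d\sqrt{T} \vee (\delta T^{k+1})^{\frac{1}{2k+1}}\big)
\end{align*}
```

Both rates match the corresponding lower bounds in their dependence on t up to logarithmic factors, which is the matching the reviewers refer to; the missing d-dependence in the second lower bound is exactly what the last reviewer flags as a weakness.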
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper focuses on estimating treatment effects under a limited overlap ie subjects with specific characteristics might only belong to one treatment group to solve this problem the authors propose a variational autoencoder vae called betaintactvae which extends the framework of cvae and ivae the authors prove that the identifies the treatment effects the proposed model is compared with other methods on synthetic datasets strengths i believe the betaintactvae is novel and is useful in treatment effects estimation the authors have provided theoretical guarantees for the proposed method experimental results also demonstrate that betaintactvae outperforms other generative models weaknesses i am able to understand the general idea of the method but have difficulty understanding some notations in the paper the authors might want to define some of the symbols before using them for example ft kt ht diag and diag1 in section 31 and ft kt ht in section 41 the notation ft indications that there are two different functions for t0 and t1 respectively is it true i do not understand the differences between pthetayx t and pyx t in section 32 in the caption of table 1 what is the unconditional balancing hyperparameter in section 42 are the parameters lambda learn from the data in equation 6 qphi x y t is still a function of t how is balanced pts achieved in this case in the second term of equatnion 7 it looks like gt2z should be gtz in general i believe the betaintactvae is novel and is useful in treatment effects estimation the authors have provided theoretical guarantees for the proposed method experimental results also demonstrate that betaintactvae outperforms other generative models docsepthe present paper proposes to address the issue of limited overlap in treatment effect estimation by investigating the identification assumptions of prognostic scores which in certain contexts are less restrictive on overlap than other methods such as propensity score based methods the main contributions of this work are presented in sections 32 and 42 where they derive identifiability of the treatment effect via prognostic scores and propose a generative prognostic model based on variational autoencoders vae the theoretical results are complemented with several synthetic and semisynthetic experiments which also show that the proposed models can compete with and in certain cases improve upon state of the art strong points the authors extend the idea of prognostic scores to the context of machine learning and with the additional difficulty of limited overlap indeed this approach that can be seen as an alternative to the more common propensity score based approaches in causal inference is indeed interesting in cases of limited overlap which is relevant especially in highdimensional settings the authors provide a broad range of theoretical results ranging from treatment effect identification via prognostic scores to a novel estimation strategy with error bounds for the cate estimation the article is well written especially the first two sections lay out the context and motivation the authors provide an adequately succinct yet sufficient list of references on related works to position their proposal issuespoints that require clarification as a general comment the claims and statements of the main text both in the identification and the estimation sections are in part hard to follow without checking in the 
appendix which makes the reading difficult the content being very dense i would suggest to either split the work into two articles one for identification and the other on estimation or to submit this work under a more readable format or to defer the experiments to the appendix instead of the theoretical aspects and small examples which would help to follow the justifications in the identification and estimation sections in definition 1 the authors talk about overlapping random variables v and in definition 2 they talk about functions do they mean functions of xinmathcalx and when they say overlapping functions mathbbptx do they mean the random variables or do they make an implicit definition of overlapping functions the authors claim that mathbbmx is an effect modifier since this relies on a definition of effect modifier from hansen 2008 which is much less common than standard definitions of effect modifiers as interaction terms between covariates and the treatment assignment variable it would be helpful to provide a concrete example if not in the main text then at least in the appendix of such an effect modifier that does not involve t at the end of section 2 the authors provide an informal definition of balanced pts why this nonrigorous definition instead of a formal one maybe im confusing the intact vae and the beta intact vae but from the graphical model of intact vae in figure 1 i dont understand the connection between z and a ps mathbbp the definition of conditionally balanced representation requires that zperp tx which does not seem to be possible under the model of intact vae the choice of the acronym ps for prognostic scores could be reconsidered since the target audience of this work seems to be rooted in causal inference where ps usually stands for the propensity score in many works i have several questions about the experiments why the notation switch from z to w in the experiments is there a conceptual difference between the two why the choice of linear functions hkl and how are they chosen why isnt the proposed method from damour and franks 2021 also included in the comparative study even if their method relies on linear assumptions it would be interesting to see a comparison in this context since the paper contains a derivation of error bounds on the cate in theorem 2 it would be interesting to assess them in the simulations and compare the empirical results to these bounds for the ihdp dataset it is written that in the used model beta1 referenced as ours in table 1 but then there are also modified versions that are reported with other values of the hyperparameter beta shouldnt ours and mod1 show the same results or am i misunderstanding the tables description minor comments that did not impact the score p1 confounders are not necessarily highdimensional i would reformulate the last sentence of the 2nd paragraph to nuance the claim rather say that the more covariates are collected the more likely unconfoundedness is to hold but that this can quickly lead to issues in separation between treatment and control groups p1 i would suggest adding the work on overlap weights by li and li 2019 p3 in section 21 i would recommend adding the sutva assumption for completeness either by mentioning it directly or by adding the exclusion of interference in the assumptions postrebuttal update i thank the authors for their detailed and timely responses to all reviewers their efforts to clarify and reformulated assumptions on the dgp model and data as well as their concession to adapt their 
notations following the reviewers remarks and the additional experiments related to the theoretical bound have helped addressing most of my concerns however i still think the positioning of their work relative to related works is difficult to follow or find which makes it difficult to compare its contributions with previous works i have therefore decided to only increase my rating by one level references 1 alexander damour and alexander franks deconfounding scores feature representations for causal effect estimation with weak overlap arxiv preprint arxiv210405762 2021 2 ben b hansen the prognostic analogue of the propensity score biometrika 952 481488 2008 3 fan li and fan li propensity score weighting for causal inference with multiple treatments annals of applied statistics 13423892415 2019 4 christos louizos uri shalit joris mooij david sontag richard zemel and max welling causal effect inference with deep latentvariable models in proceedings of the 31st international conference on neural information processing systems pp 64496459 2017 5 uri shalit fredrik d johansson and david sontag estimating individual treatment effect generalization bounds and algorithms in international conference on machine learning pp 30763085 pmlr 2017 in summary the aim and approach of this paper are interesting and i believe the approach could be a methodological gain for the causal machine learning audience but its underlying assumptions are presented in a way that makes it difficult to link or compare them to other works and its significance for realworld examples is therefore difficult to judge especially the details of the underlying causal model are not clear to me and make it difficult to conceptually compare it with other methods such as cfr shalit et al 2017 or cevae louizos et al 2017 i will read the rebuttal carefully and am willing to increase the score if the authors address the raised concerns docsepthe paper tackles the problem of estimating causal effects under partial overlap conditions overlap is a common assumption in causality so the problem is hard and relevant the authors present a theoretical analysis using pts an adaptation of prognostic scores the estimation is carried out with a modified version of a variational autoencoder vae called betaintactvae in addition the authors present bounds on the error of the conditional average treatment effect cate using their framework their paper ends with experimental results of the proposed method and a comparison with other recent methods for cate estimation strengths with the exception of a few errors the paper is well written and has an understandable language the motivation of the work is clear and relevant the theoretical analysis in the paper is serious as far as my understanding of the topic goes the proposed method seems reasonable and justified by the theoretical analysis in the paper weaknesses i found the paper to be a bit information dense in fact i am unsure about what could be some of the most subtle assumptions of the method clearly there is the assumption that x is a valid adjustment set but what about g1 additive noise models if this is necessary then why use the machinery of a neural network it seems that in order to make the theoretical analysis easier you have to assume a noiseless prior is there any chance you can make a list of the required assumptions and a sentence or two of why they are needed maybe you can consider having such an explicit version of the assumptions in the main document or a separate section in the appendix 
along the same line of the previous point i would invite the authors to think about scenarios in which such assumptions are fulfilled the examples of this paper in which we want to adjust a causal estimate using high dimensional covariates has always struck me as a bit unrealistic as the authors might know adjusting for all pretreatment variables is not always the best option because that might introduce mbias i would like to say though that as the authors correctly point out the more variables you have the lower the chance of having overlap of the treatments can you please comment on what could be some possible scenarios in which the method could be applied finally if known to the authors i would like to know the relation of this work between work on generalization of neural networks it seems from my point of view that the problem of limited overlapping of conditioning variables can be fixed by having estimators of the counterfactuals that generalize well beyond the support of the combination of covariates and treatment could it be possible that what is driving most of your results is the regularization of the estimates in general regularization is a way of achieving out of support generalization can you comment on this i believe the paper solves the interesting problem of estimating cate with limited overlap it seems to me that most of the theory and the estimation method were developed before this article so in a certain way it is an incremental contribution nevertheless both theoretical and empirical analysis of the article seem serious docsepthe paper considers the setting of limited overlap of covariates and studies identifying and estimating the causal effects this setting is difficult to deal with because it is impossible to estimate causal effects at nonoverlapping values due to a lack of data to this end the paper proposes an idea based on the prognostic score and identifiable representation intactvae the prognostic score is an appropriate tool for limited overlapping because it can map some nonoverlapping values to an overlapping value in a space of lower dimensions intactvae is a natural combination of ivae and cvae and is proposed to help to identify the causal effect the paper implements this idea by proposing a new regularized vae called betaintactvae and further analyzes the treatment effect error bounds in the end the paper compares the model with recent methods in the experiments strengths 1 authors contribute a novel approach for limited overlapping of covariates the setting of limited overlapping is interesting the proposed approach combines the prognostic score and intactvae which are appropriate and novel under this setting although the beta variational lower bound was proposed in the previous methods it plays a role of controlling balance and it can be viewed as a novel application 2 the experiment results are extensive the authors follow some existing data generation processing or data and show that the balanced prior is important in practice compared to nonbalanced ones and their models can obtain better performance 3 the theoretical analyses are novel under the setting of this paper these results follow those of ivae and adapt to the new settings the authors elaborate on some conditions to help readers to get them which is great weaknesses 1 the paper is dense and difficult to follow it introduces many concepts from different areas it takes much time to understand and check the paper i suggest authors add some plots to help readers to faster understand some 
definitions eg overlapping the paper involves and requires readers to have wide knowledge so i think that a good presentation is difficult maybe moving some contents to the appendix and illustrating some concepts could be a choice in addition characters with the same style are used for different types of subjects such as mathbbr for real numbers and mathbbm for a function i strongly suggest using consistent notations 2 experiment results measured by pehe are worse than cfr ate and pehe reflect different information ate focuses on the mean and pehe reflects the stability because if there is a large cate error pehe can be large as mentioned on the page 9 large cate errors mean that the model performs unstably at some values so it is better to discuss the results more properly i vote for weak acceptance the paper proposes a technically and theoretically sound approach and it studies an interesting setting limited overlapping though it has some presentation issues ### Summary:
in this paper the authors proposed a method for causal inference under limited overlap an important and understudied complication the authors propose to recover a prognostic score using a variational autoencoder and thereby map a higher dimensional set of covariates with limited overlap to a lower dimensional set where overlap holds and such that ignorability is maintained the paper was reviewed quite favorably by reviewers and the authors updated the manuscript to address specific issues raised by reviewers
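The reviews above repeatedly contrast ATE error with PEHE when judging the experiments. As a rough illustration of that distinction (generic metric definitions, not code from the reviewed paper), the two quantities are typically computed from true and estimated CATEs as follows:

```python
import numpy as np

def ate_error(tau_true, tau_pred):
    """Absolute error on the average treatment effect: only the two means matter."""
    return abs(np.mean(tau_pred) - np.mean(tau_true))

def pehe(tau_true, tau_pred):
    """Precision in Estimation of Heterogeneous Effect: an RMSE over individual
    CATEs, so a few badly estimated units inflate it even when the ATE is good."""
    diff = np.asarray(tau_pred) - np.asarray(tau_true)
    return float(np.sqrt(np.mean(diff ** 2)))
```

This is why, as the review notes, a method can report a competitive ATE while its PEHE still lags behind baselines such as CFR.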
[numeric columns for the row above; values omitted: a token-ID array (input_ids), an all-ones array (attention_mask), and an identical token-ID array (labels)]
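The numeric arrays above mirror the dataset schema (input_ids, attention_mask, labels), with the mask all ones and labels duplicating input_ids. A minimal sketch of how such a row could be produced is given below; the actual tokenizer used for this dataset is not stated, so the gpt2 checkpoint, the 2048 max length, and the helper name are assumptions for illustration only:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")  # assumption: any causal-LM tokenizer

def encode_row(input_text: str, output_text: str) -> dict:
    # Concatenate the review prompt (Input) and the summary (Output), as the
    # text rows in this dump do, then tokenize without padding.
    enc = tok(input_text + " " + output_text, truncation=True, max_length=2048)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all 1s when nothing is padded
        "labels": list(enc["input_ids"]),         # labels mirror input_ids here
    }
```

In practice many summarization setups instead mask the prompt tokens in labels with -100, but the rows shown in this dump keep labels identical to input_ids.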
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper describes a python package for projecting onto quadratic hypersurfaces the basic problem it solves is to minimizing projection errors in hilbert space subject to xt a x bt x c 0 the authors show that with eigen decomposition of a we can reparametrize this problem and solve it with newtons method the paper also provides a number of example codes illustrating how to use the package strength clarity of writing the tool seems to have broader usage in practice weaknesses some computational performance studies are missing it would be nice to include statistics of how this package tool scales with the growth of problem size na docsepthis paper an introduction to a python package implementing the method of projecting onto a central and noncylindrical quadratic hypersurface quadric proposed in a recently posted arxiv manuscript 21 orthogonal projection onto a set is an important tool in machine learning as it is a basic building block of many learning algorithms when the set onto which the projection is made is compact and convex the problem is relatively easy in the sense that the unique closest point exists and this point can be found using standard convex optimization algorithms when the set is nonconvex however the problem needs to be tackled case by case a quadric is a set of roots of a matrix quadratic equation characterized by the eigenvalues of the coefficient matrix a of the quadratic term and the discriminant of the equation this set is in general nonconvex the recent manuscript 21 proposition 220 provides characteristics of a projected point there can be multiple closest points from the input point since the set is nonconvex when the quadric is central discriminant is nonzero and noncylindrical a is nonsingular and also an algorithm 21 algorithm 2 for computing it this paper implements 21 algorithm 2 in python and build a package quadproj in order to democratize the algorithm a review of the theory of 21 and exposition of how to use the quadproj package with sample code snippets are provided a strength of the paper is in the democratization of the algorithm for projection onto a quadric by writing an easytouse python package the package is designed so that construction of a quadric object projection onto the quadric and visualization of the solution is straightforward the sample code snippets are also easy to follow indeed visualization in 2d or 3d cases is emphasized with a couple of figures also the exposition is quite clear a weakness is in novelty all the theory and method in sections 2 and 3 are treated in 21 so the contribution of this paper seems to be limited to section 4 description of the quadproj package i understand that careful and easytouse implementation of an algorithm is an important task but for such an implementation is publishable to a premire conference and journal i think more consideration is needed for example in the editorial policy of the siam journal on scientific computing httpsepubssiamorgjournalsisceditorialpolicy which i believe is closest in scope with the paper states for the software and highperformance computing category as follows for software in particular submitted papers should not be limited to describing a new package but must present the algorithmic or technological innovations that made the development of the software possible unfortunately i do not see either algorithmic or technological innovation in the exposition of the 
package in this paper please see strengths and weaknesses section in the present form this paper looks as if it is an appendix to 21 docsepthis paper is a short tutorial on a python optimization package which implements the method of the recent preprint 21 regarding the problem of projecting a given point of the euclidean ndimensional space onto a quadratic hypersurface strength certainly being able to project efficiently and accurately onto a quadratic hypersurface is very important and the present paper concerns a promising step in this direction here is an additional reason for the significance of this problem not mentioned in the paper it is a classical fact in algebraic geometry that any variety defined by homogeneous equations is isomorphic via the socalled veronese map to a variety defined by quadratic equations eg see exercise 29 page 25 in algebraic geometry a first course by joe harris thus being able to efficiently and accurately project onto a single quadratic hypersurface is the beginning of being able to project onto any variety since the latter can be viewed as an intersection of quadratic hypersurfaces weakness while a tutorial presenting a python package for projecting onto quadratic hypersurfaces is certainly nice to have i am of the impression that neurips is not the appropriate venue for it instead i would have loved to see a short version of 21 with experiments or a case study highlighting the importance of the problem for the machine learning and data science community and comparing with the stateoftheart homotopy continuation methods for the same problem bibliographic remark i would like to bring to the authors attention the following papers and citations therein regarding the problem of projecting onto hypersurfaces and varieties more generally 1 draisma horobet ottaviani sturmfels thomas the euclidean distance degree of an algebraic variety foundations of computational mathematics 2 breiding sottile woodcock euclidean distance degree and mixed volume foundations of computational mathematics yes docseppresents a library for computing projections onto a range of surfaces defined by quadrics this paper does a good job of describing the problem it solves and illustrating examples of how the library can be used to solve instances of problems within the class of projection problems it considers i am supportive of papers on software libraries they are explicitly within the scope of neurips and contribute directly to the ml community i dont have any larger critiques the examples are easy to understand and the visuals are simple and clear this paper addresses a very narrow problem and does not illustrate any concrete applications i would have liked to see some demonstration of applications of the library ### Summary:
the paper presents a software package to do projections on the noncylindrical central quadratic hypersurfaces while the problem is certainly interesting all the reviewers agree its motivation in the context of machine learning seems to be lacking in the paper this is missing in the paper currently and is the main source of confusion in the reviewers and the acs minds after discussions among the reviewers i believe the paper has much scope for improvements notwithstanding the merits please look at the suggestions carefully also the paper as it is seems to better fit the scope of the mloss journal rather than the neurips conference just a thought from the ac having said that i would encourage the authors to continue the development of this package
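The first review in this last example spells out the computation the package is built around: diagonalize A, rewrite the projection in the eigenbasis, and solve a scalar equation with Newton's method. A minimal NumPy sketch of that idea is given below; it is not the quadproj API (whose function names are not shown in the reviews) and it ignores the root selection a nonconvex quadric requires, so treat it as an outline only:

```python
import numpy as np

def project_onto_quadric(A, b, c, p, iters=50, eps=1e-7):
    # Project p onto {x : x^T A x + b^T x + c = 0}.
    # KKT stationarity gives (I + 2*lam*A) x = p - lam*b, which is diagonal
    # in the eigenbasis of A, leaving one scalar equation in lam.
    d, V = np.linalg.eigh(A)            # A = V diag(d) V^T (A symmetric)
    p_t, b_t = V.T @ p, V.T @ b

    def x_of(lam):
        return (p_t - lam * b_t) / (1.0 + 2.0 * lam * d)

    def residual(lam):                  # constraint value at the candidate point
        x = x_of(lam)
        return x @ (d * x) + b_t @ x + c

    lam = 0.0
    for _ in range(iters):              # Newton with a numerical derivative
        f = residual(lam)
        fp = (residual(lam + eps) - f) / eps
        lam -= f / fp
    return V @ x_of(lam)
```

Because the quadric is nonconvex, the scalar equation can have several roots (several KKT points); a robust implementation, as the reviews note, must enumerate them and keep the closest feasible one, and must also guard against the poles where 1 + 2*lam*d_i vanishes.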
[numeric token-ID array (input_ids/labels column) for the quadric-projection row above; values omitted]
846, 11985, 2190, 253, 30628, 891, 2868, 253, 2929, 556, 1199, 7990, 323, 11701, 30812, 253, 16108, 4496, 1007, 387, 253, 13991, 9257, 671, 253, 2929, 347, 352, 310, 3133, 281, 1805, 4944, 253, 7990, 273, 253, 13361, 1730, 6698, 2581, 685, 253, 5723, 2824, 8059, 816, 247, 1869, 432, 253, 913, 1907, 753, 326, 891, 651, 11907, 253, 4477, 281, 4035, 253, 2440, 273, 436, 5522, 209 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 8631, 247, 15548, 5522, 323, 35104, 4830, 21396, 24052, 321, 6511, 253, 5044, 1895, 352, 35910, 310, 281, 28699, 12378, 6332, 275, 288, 300, 6291, 2317, 2256, 281, 209, 633, 247, 1269, 50276, 2612, 1269, 50276, 68, 50276, 17, 253, 4477, 921, 326, 342, 9216, 14717, 273, 247, 359, 476, 294, 3575, 292, 363, 2721, 436, 1895, 285, 8415, 352, 342, 747, 24787, 1332, 253, 2929, 671, 3400, 247, 1180, 273, 1650, 11646, 34805, 849, 281, 897, 253, 5522, 50276, 45563, 50275, 498, 15752, 273, 4028, 50276, 783, 4968, 3133, 281, 452, 16055, 10393, 275, 3946, 50276, 20881, 1255, 265, 50275, 8826, 15180, 3045, 2175, 403, 5816, 352, 651, 320, 5322, 281, 2486, 9990, 273, 849, 436, 5522, 4968, 11498, 342, 253, 3116, 273, 1895, 1979, 50274, 2072, 5474, 33032, 2520, 2929, 271, 10199, 281, 247, 15548, 5522, 16994, 253, 1332, 273, 35104, 4830, 247, 4275, 285, 1327, 32990, 527, 5526, 21396, 24052, 32961, 9853, 695, 4081, 275, 247, 4102, 9269, 549, 32693, 7714, 3127, 19627, 12378, 4830, 247, 873, 310, 271, 1774, 4968, 275, 5145, 4715, 347, 352, 310, 247, 5044, 3652, 2972, 273, 1142, 4715, 11333, 672, 253, 873, 4830, 534, 253, 12378, 310, 1160, 310, 8566, 285, 17133, 253, 1895, 310, 4942, 3477, 275, 253, 3282, 326, 253, 4451, 8642, 1127, 4961, 285, 436, 1127, 476, 320, 1119, 970, 2629, 17133, 13757, 11333, 672, 253, 873, 310, 1327, 44181, 2299, 253, 1895, 3198, 281, 320, 11463, 1070, 1083, 407, 1083, 247, 9853, 695, 310, 247, 873, 273, 11465, 273, 247, 4315, 21396, 5150, 7943, 407, 253, 20223, 273, 253, 10235, 4315, 247, 273, 253, 21396, 1307, 285, 253, 20741, 386, 273, 253, 5150, 436, 873, 310, 275, 2087, 1327, 44181, 253, 3332, 7714, 3127, 13989, 18881, 3400, 50276, 17932, 3397, 273, 247, 16589, 1127, 627, 476, 320, 2709, 8642, 2792, 432, 253, 3280, 1127, 1580, 253, 873, 310, 1327, 44181, 50276, 9453, 253, 9853, 695, 310, 4275, 20741, 386, 310, 28078, 285, 1327, 32990, 527, 5526, 247, 310, 14122, 272, 792, 285, 671, 271, 5933, 3127, 5933, 374, 323, 12672, 352, 436, 2929, 17930, 3127, 5933, 374, 275, 15548, 285, 1973, 247, 5522, 9853, 29167, 275, 1340, 281, 8738, 255, 907, 253, 5933, 247, 2278, 273, 253, 3762, 273, 3127, 285, 47284, 273, 849, 281, 897, 253, 9853, 29167, 5522, 342, 3410, 2127, 3802, 46588, 403, 2530, 50276, 66, 4757, 273, 253, 2929, 310, 275, 253, 8738, 47159, 273, 253, 5933, 323, 12378, 4830, 247, 9853, 695, 407, 4028, 271, 1842, 1767, 1312, 15548, 5522, 253, 5522, 310, 4158, 594, 326, 5140, 273, 247, 9853, 695, 1789, 12378, 4830, 253, 9853, 695, 285, 24426, 273, 253, 2900, 310, 15246, 253, 3410, 2127, 3802, 46588, 403, 671, 3477, 281, 956, 6296, 24426, 275, 374, 69, 390, 495, 69, 2219, 50276, 261, 21947, 342, 247, 4564, 273, 8442, 671, 253, 47284, 310, 3240, 2590, 50276, 66, 14855, 310, 275, 38135, 512, 253, 3762, 285, 1332, 275, 7118, 374, 285, 495, 403, 4127, 275, 3127, 594, 253, 7680, 273, 436, 2929, 3133, 281, 320, 3710, 281, 2593, 577, 5740, 273, 253, 9853, 29167, 5522, 891, 2096, 326, 10182, 285, 1842, 1767, 1312, 7092, 273, 271, 5933, 310, 271, 1774, 4836, 533, 323, 824, 271, 7092, 310, 15452, 494, 281, 247, 5398, 603, 8059, 285, 6698, 891, 1158, 625, 8180, 310, 3058, 50276, 1542, 1650, 275, 253, 21977, 3646, 273, 253, 4927, 312, 6698, 327, 8249, 12672, 3944, 339, 16712, 859, 16726, 2061, 34859, 261, 758, 2081, 451, 22872, 534, 891, 2868, 310, 8642, 275, 7990, 342, 253, 2929, 3054, 323, 253, 3694, 285, 1029, 24159, 12672, 7140, 347, 3637, 
50274, 1542, 3694, 275, 1798, 9262, 9380, 943, 417, 320, 3710, 281, 12930, 247, 747, 5522, 533, 1364, 1246, 253, 5933, 280, 390, 20417, 32771, 326, 1160, 253, 2440, 273, 253, 3694, 1896, 50276, 328, 9520, 891, 513, 417, 923, 2057, 5933, 280, 390, 20417, 15832, 275, 253, 47284, 273, 253, 5522, 275, 436, 2929, 50276, 32897, 923, 20544, 285, 32213, 2593, 275, 253, 1246, 830, 436, 2929, 4453, 347, 604, 352, 310, 271, 30762, 281, 3127, 5474, 33032, 2520, 2929, 310, 247, 2159, 23647, 327, 247, 15548, 13757, 5522, 534, 17930, 253, 1332, 273, 253, 3332, 638, 3845, 3127, 5001, 253, 1895, 273, 35104, 247, 1677, 1127, 273, 253, 299, 26365, 295, 6967, 2317, 4830, 247, 21396, 24052, 32961, 50276, 45563, 50276, 68, 20427, 1146, 2104, 281, 2199, 14556, 285, 13613, 4830, 247, 21396, 24052, 32961, 310, 1077, 1774, 285, 253, 1246, 2929, 7350, 247, 12532, 3213, 275, 436, 3884, 1060, 310, 271, 3081, 1921, 323, 253, 8453, 273, 436, 1895, 417, 5393, 275, 253, 2929, 352, 310, 247, 8946, 958, 275, 20157, 12087, 326, 667, 5235, 2931, 407, 17010, 7424, 310, 25783, 3066, 253, 9267, 18859, 2336, 2487, 70, 3711, 281, 247, 5235, 2931, 407, 21396, 7424, 24088, 923, 5763, 3285, 3239, 2030, 275, 20157, 12087, 247, 806, 2282, 407, 3371, 70, 288, 34662, 3021, 1146, 2104, 281, 14556, 285, 13613, 2199, 4830, 247, 2014, 21396, 24052, 32961, 310, 253, 5068, 273, 1146, 2104, 281, 2199, 4830, 667, 5235, 1580, 253, 6158, 476, 320, 11575, 347, 271, 15171, 273, 21396, 24052, 321, 6511, 50275, 20881, 1255, 50276, 6050, 247, 23647, 15250, 247, 15548, 5522, 323, 35104, 4830, 21396, 24052, 321, 6511, 310, 5604, 5322, 281, 452, 891, 717, 273, 253, 13214, 326, 5723, 2824, 310, 417, 253, 4569, 18767, 323, 352, 50276, 34235, 891, 651, 452, 7636, 281, 923, 247, 2159, 2715, 273, 3127, 342, 4679, 390, 247, 1083, 1263, 27321, 253, 6349, 273, 253, 1895, 323, 253, 5145, 4715, 285, 941, 5859, 3114, 285, 10941, 342, 253, 1375, 23037, 14387, 32866, 26272, 3082, 323, 253, 1072, 1895, 50276, 2383, 965, 5576, 7579, 891, 651, 751, 281, 3324, 281, 253, 4477, 4116, 253, 1563, 9380, 285, 30404, 15308, 5001, 253, 1895, 273, 35104, 4830, 24052, 321, 6511, 285, 19112, 625, 3839, 337, 6536, 46984, 3499, 706, 292, 258, 1440, 580, 27510, 331, 321, 32067, 1241, 289, 4921, 253, 299, 26365, 4181, 4248, 273, 271, 20157, 5235, 27629, 273, 15180, 23065, 374, 1517, 2821, 256, 1519, 587, 5534, 26822, 299, 26365, 4181, 4248, 285, 6804, 4644, 27629, 273, 15180, 23065, 50275, 9820, 5474, 339, 377, 5957, 247, 6335, 323, 12672, 20553, 4830, 247, 2491, 273, 9421, 2931, 407, 9853, 18211, 436, 2929, 1057, 247, 1175, 2628, 273, 12930, 253, 1895, 352, 35910, 285, 34805, 6667, 273, 849, 253, 6335, 476, 320, 908, 281, 8415, 10872, 273, 3237, 1561, 253, 966, 273, 12378, 3237, 352, 19401, 891, 717, 23384, 273, 9380, 327, 3694, 13747, 597, 403, 11120, 1561, 253, 7990, 273, 5723, 2824, 285, 8162, 3587, 281, 253, 13361, 3114, 891, 13414, 452, 667, 4067, 2268, 4624, 253, 6667, 403, 3477, 281, 2096, 285, 253, 5304, 84, 403, 2969, 285, 2590, 436, 2929, 12453, 247, 1077, 6891, 1895, 285, 1057, 417, 17093, 667, 11859, 4893, 891, 651, 452, 10490, 281, 923, 690, 20028, 273, 4893, 273, 253, 6335, 2490, 187, 4118, 18435, 27, 783, 2929, 10262, 247, 3694, 5522, 281, 513, 20553, 327, 253, 1327, 32990, 527, 5526, 4275, 21396, 24052, 321, 6511, 1223, 253, 1895, 310, 5604, 4722, 512, 253, 30628, 5194, 697, 16038, 275, 253, 3634, 273, 5145, 4715, 3133, 281, 320, 14999, 275, 253, 2929, 436, 310, 5816, 275, 253, 2929, 4390, 285, 310, 253, 2022, 2603, 273, 13775, 275, 253, 30628, 285, 253, 913, 84, 13846, 
846, 11985, 2190, 253, 30628, 891, 2868, 253, 2929, 556, 1199, 7990, 323, 11701, 30812, 253, 16108, 4496, 1007, 387, 253, 13991, 9257, 671, 253, 2929, 347, 352, 310, 3133, 281, 1805, 4944, 253, 7990, 273, 253, 13361, 1730, 6698, 2581, 685, 253, 5723, 2824, 8059, 816, 247, 1869, 432, 253, 913, 1907, 753, 326, 891, 651, 11907, 253, 4477, 281, 4035, 253, 2440, 273, 436, 5522, 209 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper introduces the problem of domain adaptation under open set label shift osls where the classconditional distributions pxy are domaininvariant and py can change domain adaptation under label shift and positiveunlabeled pu learning can all be considered as special cases of osls this paper also provides the identifiability of osls including the necessary condition weak positivity and sufficient conditions strong positivity under the strong positivity condition the osls problem can be broken into k pu problems the pu learning algorithms cannot scale to datasets with a large number of classes because of error accumulation this paper then proposes the pulse framework to solve this problem by exploiting the joint structure of the problem with sourceclass resampling experiments across 7 semisynthetic benchmarks show that the proposed pulse consistently outperforms osda baselines strengths open set label shift osls is an interesting problem assuming pxy is the same and py is changing theories on identifiability of the problem and convergence analysis a new algorithm pulse to solve the problem good experimental results weakness all experimental results are semisynthesized and are on relatively small datasets such as cifar10 cifar100 it is good to have results on some popularly used datasets for domain adaptation such as domainnet the writing is not easy to follow the references eg algorithms 2 and 3 switch between the main paper and supplementary materials without clear indication there are also some minor grammar errors including in the formula at line 153 the benchmarks used in this paper are all semisynthesized and thus lack evidence from realworld applications the paper mentions that in future work we hope to bridge the gap between the necessary and sufficient identifiability conditions
docsepthis paper scopes the common open set domain adaptation problem to specifically consider open set label shift osls problems where pxy is constant but the class proportions may change between source and target and there may be a new unseen class added in testing in this setting the goal is to identify instances of the unseen class while also performing adequately on the previously seen classes the paper proposes a new framework pulse which combines classical positive and unlabeled learning techniques and label reweighting techniques in order to tackle this new problem the method is empirically shown to perform well in a variety of osls problems and theoretical analysis is conducted to create sufficient conditions for identifiability of the osls problem originality to the best of my knowledge this is the first time i have seen the scoping of the open set domain adaptation problem by focusing purely on label shift the authors are able to create some novel theoretical results as well as a highly performant framework for tackling the osls problem in particular their framework combines two standard techniques to great effect class reweighting to handle the label shift and pu techniques to handle the open set nature of the problem quality the claims are wellsubstantiated both theoretically and empirically and the results are impressive the authors perform a detailed evaluation on existing open source domain adaptation methods as well as a slight ablation study by performing standard pu techniques without any label reweighting clarity the paper is mostly well organized and written a worthwhile addition in the related work section would be to make it more clear exactly how your setting differs from each of the main categories that are defined there further it may be worth a few sentences explaining why identifiability is useful and how it relates to the remainder of the paper section 4 felt significantly out of place compared to the rest of the paper some minor nitpicks line 48 matrix submatrix appears to be a typo line 257 should likely be a heading significance by scoping the osda problem to focus on label shift the authors were able to show relatively significant gains compared to standard osda methods this kind of scoping appears quite fruitful for other researchers to build off of the theoretical contributions in the paper are strong evidence that their framework pulse is able to perform quite well on a variety of scenarios which leaves it as a valuable benchmark for future work it would be worthwhile to have a further discussion on limitations of pulse with regards to the amount of labeled data of the target domain for example with no labels it would be difficult to approximate what the label shift is it could also be useful to discuss the types of models or datasets that pulse is expected to work on eg does it perform worse as the number of classes increases the computational efficiency with regards to larger models etc and that it inherits limitations from its particular stages eg any limitations of importance weighted label shift or cvir/bbe
docsepthe goal of this paper is to solve the open set domain adaptation under the label shift setting the authors proposed a pu learningbased framework to first estimate the label shift and then classify the novel class moreover the authors also gave sufficient and necessary conditions to the open set label shift in order to make the target label marginal identifiable the experimental results showed that the pulse framework could achieve great performance improvement 1 the idea of combining pu problems with the osls is interesting the authors used an ingenious way to merge the pu problems into the osls the reduction from the k pu problems to a single pu is attractive equation 3 2 the definitions and theorems are straightforward and easy to understand and the mathematical proofs seem solid 3 the author used a twostage method to separately estimate the label shift and to identify the novel class which seems effective in solving both the domain adaptation and novel class identification the results of pulse seem good 4 i have no doubts in the originality to the best of my knowledge the quality of this paper is good some more details need to be discussed for clarity as i stated later and the contribution of this paper has some significance to the osls regime 1 this paper needs to be better organized 2 some details need to be completed as described in the questions 3 the experiments in this paper are not convincing
docsepthis paper introduces domain adaptation under open set label shift osls specifically it assumes the target has one more novel class that was not previously seen in the source domain while allowing label shifts between source and target domains this work provides theoretical findings on osls specifically the necessary condition weak positivity and two sufficient conditions strong positivity and separability the author further proposes a framework to solve osls named pulse which combines techniques from both areas of 1 positive and unlabeled learning and 2 da under label shift the effectiveness of the proposed methods is demonstrated on language image and medical datasets originality the paper focuses on a special case of open set domain adaptation where the target domain has one novel class that is not previously seen in the source domain the big concept of open set da is not novel while this special case is not yet well studied and thus can be considered novel the paper delivers an identifiability analysis of osls which is novel quality the paper provides solid theoretical analysis as well as extensive experiments on 5 datasets across multiple application domains minor issues typos 1 line 46 the matrix submatrix 2 line 67 double periods after sec7 3 line 154 the factor 1/pt(y=k+1) is missing in the closed form of pt(x|y=k+1) 4 the paper seems not to define what the metric novel prevalence estimation is clarity the paper has a lot of content and is pretty dense the majority of the paper is wellwritten and clear however i find the method section a bit obfuscated one reason is that there are no headings or subsection titles to remind the reader what the focus of each paragraph is another reason is that the author seems to jump to the details too quickly the logic of the current text is kind of linear i would like to suggest the author organize section 6 and each of its subsections to have an overviewanddetails structure significance the paper does a good job on the specific problem it aims to solve the analysis is solid and the empirical results are illustrative my main concern is that the papers significance is constrained by the practicality of the proposed problem the problem setting is a bit artificial to me i am not sure how practical it is that we assume the target domain has exactly one novel class unseen the experiment is kind of synthetic/semisynthetic since the author chooses source and novel classes randomly this work does not raise potential negative societal impact ### Summary:
the paper addresses an interesting domain adaptation question and proposes a novel and elegant solution supported with relevant theory although some issues have been raised all reviewers agree that the paper is worth publishing and we expect the authors to take into account the comments of the reviewers eg discussing limitations of pulse and checking positivity conditions
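For concreteness, the reduction the reviewers praise can be sketched in a few lines: under strong positivity, each seen class gives a positive-unlabeled (PU) problem in which source examples of that class are the positives and the target sample is the unlabeled set, and whatever target mass the k estimated class priors do not explain is attributed to the novel class (this rests on the mixture identity p_t(x) = sum_j p_t(y=j) p(x|y=j) + p_t(y=k+1) p_t(x|y=k+1), the same closed form whose missing factor the fourth review flags). The snippet below is an illustration written for this document, not code from the PULSE paper; the mean-score ratio is only a crude stand-in for the BBE/CVIR-style mixture-proportion estimators the reviews mention, and all names are hypothetical.

```python
import numpy as np

def estimate_osls_priors(src_scores, src_labels, tgt_scores):
    """Sketch only: reduce open set label shift to k positive-unlabeled problems.

    src_scores, tgt_scores: (n, k) class-probability outputs of a classifier
    trained on the k source classes; src_labels: (n,) integer class labels.
    """
    k = src_scores.shape[1]
    alphas = np.zeros(k)
    for j in range(k):
        pos_mean = src_scores[src_labels == j, j].mean()   # E_src[f_j(x) | y = j]
        unl_mean = tgt_scores[:, j].mean()                 # E_tgt[f_j(x)], target acts as "unlabeled"
        alphas[j] = np.clip(unl_mean / max(pos_mean, 1e-8), 0.0, 1.0)  # crude estimate of p_t(y = j)
    novel_prevalence = max(0.0, 1.0 - alphas.sum())        # leftover mass -> p_t(y = k+1)
    return alphas, novel_prevalence
```

With estimates like these, source examples of the seen classes can be importance-reweighted and a (k+1)-th output trained for the novel class, which mirrors the two-stage structure (first estimate the shift, then classify the novel class) that the third review describes.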
[input_ids / attention_mask / labels omitted: token-id encodings of the Input/Output record above, duplicating its text in non-human-readable form]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper points out that previous offline reinforcement learning methods are too conservative about the outofdistribution ood actions and instead proposes a mildly conservative algorithm it introduces an auxiliary loss term to properly train the value function for ood actions the proposed method mcq is shown to outperform previous methods empirically and is theoretically proved to behave at least as well as the behavior policy and has no erroneous overestimation pros this paper is clearly written and the whole structure is organized and easy to follow the method is wellmotivated and the claims in the paper are all supported by either theoretical analysis or experimental results although overpessimism is not a new problem in offline rl this method is very innovative and elegant in its processing of ood actions in comparison with the baselines mcq achieves a remarkable improvement on random or medium datasets the authors also make a careful analysis of the sensitivity of hyperparameters cons i want to draw the authors attention to a recent paper that also addresses the overpessimism of offline rl algorithms https://arxiv.org/pdf/2207.02200.pdf which uses an adaptive method besides i am also confused about the results in figure 2 it seems that decreasing lambda is always beneficial to the final performance and there is no trend that introducing the auxiliary loss can help the final performance i think the authors should use a small scale of lambda 003 to explicitly show the benefits of the auxiliary loss the authors have addressed the limitations in the paper
docsepoffline rl is a topic of significant interest one common class of approaches is to learn an actionvalue function but to enforce that the function is conservative so that it does not result in a policy which takes actions that were not in the training data and are therefore of unknown value this work introduces a mildly conservative bellman operator in particular for actions in the support of the behavior policy the operator behaves like the standard bellman operator but for actions outside the behavior policy support it assumes that the value is delta less than that of an action in the support they show that this operator will always result in a conservative q estimate that is it will not overestimate the value of any action they then introduce a practical approximation where the behavior policy is estimated by a cvae and test on a set of offline rl control tasks it performs notably better than prior work on poor demonstrations but does not consistently outperform td3bc when expert demonstrations are available strengths well communicated principled explanation of an algorithm that empirically performs well on a set of benchmark tasks offline rl is a topic of significant interest to the community and active research weaknesses ideally the algorithm would be tested on a different style of tasks as well eg perhaps atari rather than only mujoco control tasks this approach does not seem to perform as well when expert level demonstrations are available minor i found definition 2 line 123 confusing since it refers to mu(a|s) which as stated above you are trying to avoid it is explained further what the actual practical solution is when mu is not known but i found this a bit confusing on first read yes
docsepthe paper presents a method for offline reinforcement learning called mcq that involves assigning pseudo qvalues to ood actions the methods main idea is to modify the qtargets by detecting the outofdistribution actions via a density model and assign the q values to these actions similarly to bcq by taking a maximum of a q function over n samples from a density model the main difference to bcq is that the authors propose to use this backup operator only for ood actions instead of all actions and use an actor to recover optimal actions from the modified qfunction the method is evaluated on several datasets from d4rl where it outperforms the baselines strengths the method is theoretically sound and practical even though all method components have been explored in prior work the idea of not penalizing the values of ood actions but using a bcqstyle value estimate to impute the values for these actions is novel the practical version of the algorithm eqn 11 is easy to implement mcq demonstrates good empirical results on a subset of d4rl tasks the paper is well written and easy to follow weaknesses the method can be seen as an extension of bcq however the comparison to bcq is missing the practical implementation of the method diverges from the theory in particular the practical implementation omits ood detection and instead regularizes all actions sampled from the training policy which are not necessarily ood and throughout training will certainly become less ood which can result in overpenalizing the optimal actions due to value underestimation caused by the bcqstyle operator the method is evaluated only on the locomotion tasks from d4rl which do not require stitching 1 or dynamic programming also the method is evaluated using only 4 seeds which might be insufficient the offline to online experiments miss essential details in particular it is unclear how the authors obtained results for other methods also the authors pick a different subset of tasks for offline to online finetuning than considered in the original papers awac iql 1 rvs what is essential for offline rl via supervised learning s emmons b eysenbach i kostrikov s levine right now the main limitation of this work is limited experimental evaluation in particular the method is evaluated only on locomotion tasks using an insufficient number of runs and the paper considers a set of tasks different from the standard tasks used in prior work for offline to online experiments i will raise my score if these concerns are addressed
docsepthis paper proposes using mild conservatism in offline rl to benefit generalization and to avoid being overly conservative on ood actions specifically this paper develops a mildly conservative bellman mcb operator for offline rl where ood actions are actively trained and their q values are actively queried theoretical results under the tabular setting and a practical mcb operator are provided empirically combining the practical mcb operator with sac performs well on the d4rl mujoco locomotion tasks and when transferring from offline learning to online strengths 1 the proposed method is wellmotivated by theory 2 the idea of actively training the qvalues of ood actions is interesting though in some sense similar to the cql paper 1 3 experiments are extensive and the proposed method generally performs well weaknesses 1 since the proposed method requires behavior cloning bc it is doubtful if the proposed method can work well on higherdimensional andor nonmarkovian datasets where bc can be difficult theoretical results such as proposition 5 require sufficiently accurate bc which may not be possible on harder datasets in fact from table 8 the proposed method performs less favorably on the maze2d datasets compared with other stronger baselines such as optidice 2 and is slightly worse than onestep rl 3 on the adroit dataset 2 there seem to be discrepancies between the impractical theoretical algorithm and the theoretically lesssupported practical algorithm in particular the practical loss functions l167-193 3 from tables 4 and 9 the proposed method requires perdataset tuning of the weighting coefficient lambda on a relatively fine grid which muddies the empirical significance since many of the compared baselines actually unify hyperparameters across the mujoco datasets eg cql iql and td3bc [1] kumar aviral et al conservative qlearning for offline reinforcement learning advances in neural information processing systems 33 2020 1179-1191 [2] lee jongmin et al optidice offline policy optimization via stationary distribution correction estimation international conference on machine learning pmlr 2021 [3] brandfonbrener david et al offline rl without offpolicy evaluation advances in neural information processing systems 34 2021 4933-4946 the authors briefly addressed the limitations no potential negative societal impacts are discussed ### Summary:
all reviewers are generally positive or borderline about this paper reviewers note that the method is theoretically sound and practical to implement even though all of the components have been explored previously the authors combine them in a novel approach that convincingly improves over prior works major concerns have been addressed by the authors response however i agree with reviewer fvhb that per dataset tuning of lambda muddies the comparison with previous approaches that do not do the same i would encourage the authors to additionally report the best performance with a single setting across datasets to make the comparison clearer
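To make the backup described in the reviews above concrete, here is a minimal sketch of an OOD-aware, BCQ-style pseudo-target. It is an illustrative reading of the reviews, not the authors' implementation: `q`, `behavior_sample`, and `is_ood` are hypothetical placeholders for the learned Q-function, the density/behavior model, and the OOD detector, and the margin `delta` stands in for the mild conservatism the second reviewer mentions.

```python
import numpy as np

def mcb_style_target(state, action, reward, next_state, next_action,
                     q, behavior_sample, is_ood, gamma=0.99, n=10, delta=0.05):
    # illustrative pseudo-target: ood actions get a bcq-style in-support value,
    # in-distribution actions get the usual bellman backup (assumed reading of the reviews)
    if is_ood(state, action):
        # max q over n actions sampled from the density/behavior model at the same state,
        # minus a small margin so ood actions are valued slightly below supported ones
        candidates = behavior_sample(state, n)
        return max(q(state, a) for a in candidates) - delta
    # standard backup for actions that look in-distribution
    return reward + gamma * q(next_state, next_action)

# tiny synthetic usage with stand-in components (not the authors' models)
rng = np.random.default_rng(0)
w = rng.normal(size=6)
q = lambda s, a: float(w @ np.concatenate([s, a]))          # placeholder linear q function
behavior_sample = lambda s, n: rng.uniform(-1.0, 1.0, size=(n, 3))
is_ood = lambda s, a: np.linalg.norm(a) > 1.2               # placeholder density threshold

s = rng.normal(size=3)
a = np.array([1.5, 0.0, 0.0])  # deliberately outside the sampled action range, i.e. "ood"
print(mcb_style_target(s, a, reward=0.0, next_state=rng.normal(size=3),
                       next_action=np.zeros(3), q=q,
                       behavior_sample=behavior_sample, is_ood=is_ood))
```

The sketch only shows where the branch between OOD and in-support actions would sit; in the actual paper the OOD set, the targets, and the actor update are trained jointly rather than hard-coded like this.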
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper studies models trained with alphazero for the game of hex the paper applies techniques from the nlp literature to investigate whether the learned model learns concepts usually taught to human players the paper further investigates if the trained agent is able to act according to such concepts eg connect a bridge when that is needed to win the game or to reach an advantageous state of the game strengths the paper is well written and the experiments reported are interesting although the metrics and procedures come from the nlp literature this paper is the first to attempt them in models learned with alphazero i find particularly interesting the probe procedure where a linear classifier is trained to detect known concepts of hex interestingly the concepts studied are detected with very high accuracy and the accuracy improves as alphazero further trains the model this shows an interesting connection between the learned model and concepts that we humans deem important for playing the game there is a chance this paper will inspire others to work on the extraction of such concepts from the learned model this future research endeavor could be important for those interested in teaching humans the knowledge that is generated with learning systems the paper also makes interesting connections with previous work i particularly like the connection between the results presented in the paper and those of mcgrath et al 37 on chess regarding where in the neural model the short and longterm decisions are stored such connections facilitate the work of others starting out in this field it was also interesting to see the methodology the paper uses to verify that the learned agent learns the connections in the board of hex weaknesses some of the results reported in the paper arent as interesting as the idea of running the experiments for example shortterm decisions are encoded near the output layers of the neural model perhaps this isnt surprising because such decisions are actionable and one would expect the last layer of the model which is also linear to correctly identify such situations another result that perhaps isnt particularly interesting is the fact that the studied concepts are first noted in the mcts search and only later incorporated in the model this is because mcts generates the training data for the model so the natural order is to first generate the data and then train the model it would be strange if the model learned before mcts collected the data it is interesting that the concepts escape and bottleneck are first detected in probing and only later detected in the decisions of the agent figure 6 of the paper i list this as a weakness because the paper offers no explanation of why this is the case this result almost contradicts the idea of mcts generating training data and the model learning from the training data this is because the concept can be first noted in the neural model and only then appear as decisions in the game behavior i apologize if i missed the explanation of this phenomenon and would appreciate a response from the authors about it other comments regarding the following this highlights a weakness in alphazero and a risk some concepts may be provable and useful to people but deemed less important by az the agent is probably playing dead cells because it has either won or lost the game and it doesnt matter if it plays on those cells if this is
the case then this isnt a weakness of alphazero but rather a challenge for systems that will attempt to extract information from the learned model to transfer to human learners the experiments were performed on a single domain and it isnt clear whether they will generalize to other domains eg security games or even other board games the experiments and results arent exactly actionable they are of scientific interest and might inspire others but it isnt clear what role this paper will have moving forward docsepthe paper investigates alphazeros internal representations in the game of hex by using two evaluation techniques model probing and behavioral tests they use probing classifiers to determine if agents can encode conceptual information and design behavioral tests to measure whether the agents can use the concepts to win games the paper combines both representational and behavioral approaches to analyze reinforcement learning agents in the game of hex overall the result is quite interesting and allows the readers to gain an understanding of how alphazero agents learn however the methods used in the paper already exist and have been widely used in many other board games such as chess 1 and go 2 the paper simply applies these methods to a different game hex i think the novelty is not enough and the value of the results in this paper to the whole community will be limited 1 t mcgrath a kapishnikov n tomasev a pearce d hassabis b kim u paquet and v kramnik acquisition of chess knowledge in alphazero arxiv preprint arxiv:2111.09259 2021 2 n tomlin et al understanding gameplaying agents with natural language annotations to appear acl 2022 the authors have addressed most of the possible limitations and potential negative societal impact in section 3.4 docsepthe authors use linear probing to identify predefined hex concepts in the activations of the agent they use behavioural tests to evaluate whether the concepts that were identified with linear probing are actually enacted by the agent they identify that alphazero does indeed represent concepts but inconsistently enacts some of them namely the negative ones the concepts related to which moves the agent should not play they find that short term concepts those related to winning the game are represented in later layers in the agent and long term concepts those related to game states that are distant from a potential winning state are represented in middle layers they also find that the learned embeddings reflect the structure of the game board im inclined to believe the authors when they describe their work as concurrent to the very similar work of mcgrath et al 2021 which means the authors work is at least somewhat novel however the authors sell this work as borrowing methods from nlp when in fact those methods are linear probes and behavioural studies methods that are sufficiently general that it cant be accurate to describe them as nlp methods even the reference the authors use for linear probing is from image classifiers alain and bengio 2017 consequently the discussion in the introduction regarding nlp ought to be changed to reduce the implied links between nlp and the present work the experiments are also not especially extensive and do little to address the weaknesses of prior work particularly those of mcgrath et al which are that they rely on a predefined set of concepts nor is it possible in hex to use this method to improve human play since perfect play for hex is known at least for boards of certain sizes it would have been interesting and novel to expand on prior
work to see whether alphazero uses novel concepts in settings where perfect play is unknown overall the paper is reasonably well presented however figure 6 is unclear why are there lines connecting concepts in the same group the authors should use a consistent name for structuralneighbours in both the figure its caption and the text the choice of the game of hex warrants some explanation especially given that it dramatically reduces the audience of people who are interested in the game compared with go or chess structural comment put long term vs short term sec 2.1 into the concept taxonomy sec 2.2 the authors do little to address the limitations of the method of linear probing and behavioural studies namely that it relies on a bank of preidentified human concepts consequently making it difficult to understand how the network plays so well beyond concepts humans already know nor does it really give us a deep picture about how the network actually achieves these feats of play the authors adequately address the potential societal impact of their work docsepin this paper the authors use probing and behavioral tests to investigate to what extent and when alphazero-style dnns learn to use various concepts that humans are also known to use in the board game hex the analysis shows that the dnn tends to consistently pass many of the tests for almost all of the evaluated concepts after about 75% of training progress negative concepts which recognise cells that are useless to play in such as dead cells are an exception these appear to still not be fully understood even at the end of training and this can be observed in the agents behaviour as well where it sometimes does not win as fast as it could wasting some moves instead for most concepts the agents performance on behavioral tests tends to improve before the performance on probing tests which suggests that the full mcts agent may learn how to use them before this knowledge gets encoded into the dnn in general it appears that longterm concepts tend to be encoded in middle layers whereas shortterm concepts tend to be encoded closer to the final layers the overall methodology used for the analysis described above may be applicable to other games and other rl environments strengths 1 interesting and important topic with potential applications to improve our understanding and ability to interpret what deep rl models have learned and when they learn it and what they fail to learn 2 wellwritten largely easy to follow 3 discussions of related work seem good dont see anything missing there though my background and awareness of existing work is more so in rl and games than in the probing / interpretability angle weaknesses 1 i believe that one small but important part of the paper could use some clarifications in the writing section 3.2 on representational probing i will elaborate below 2 i think that a couple of claims in the paper may be slightly too strong and need a bit more nuance i will elaborate below 3 a lot of the details described in section 3.3 behavioral tests seem quite specific to the game of hex for the specific case of hex we can indeed know how to create such states that i contain a concept ii contain that concept in only exactly one place iii make sure that the agent must play according to the concept immediately because otherwise it would lose i imagine that setting up such specific situations may be much more difficult in many other games or rl environments and would certainly require highly game-specific knowledge again for such tasks this seems like a
potential limitation which doesnt seem to be discussed yet on weakness 1 the first sentence that i personally found confusing was to form the random control for each board (h0, y) in the probing dataset we consistently map each cell in that board to a random cell forming hs0 i guess that map each cell in that board to a random cell means creating a random mapping from all cells in the original board to all cells in the control board in a way such that every original cell maps to exactly one randomly selected control cell and every control cell is also mapped to by exactly one original cell and then the value of each original cell black/white/empty is assigned to the control cell that it maps to i guess this is what is done and it makes sense but its not 100% explicit im afraid that many readers could misunderstand it as simply saying that every cell gets a random value directly then a bit further down under implementation details it is described how the boards in the probing dataset get constructed i suspect it would make more sense to actually describe this before describing how the matching controls are created on weakness 2 a the behavioral tests involve states created specifically such that they i contain the concept but also ii demand that the agent immediately plays according to the concept because it will lose otherwise in the game of hex this means that all of these board states for all these different concepts actually include one more new concept that is shared across all the tests a concept that recognises a long chain of enemy pieces that is about to become a winning connection if not interrupted by playing in what is usually just one or two remaining blank cells in between so i do not believe that we can say with 100% certainty that all these behavior tests are actually testing for the concept that you intend them to test for some or all of them may simply be testing more generally if the agent can recognise when it needs to interrupt the opponents soon-to-be-winning chain b fig 5 shows evidence that some information is learned before the model is able to use the concepts i think evidence may be too strong here and would say something more like fig 5 suggests that some information may be learned technically fig 5 just shows that there is generally a long period with no progress on the tests and after a long time suddenly rapid progress on the tests to me this indeed suggests that it is likely that it is learning something else first but it is not hard evidence it could also be that its just randomly wandering about the parameter space and suddenly gets lucky and makes quick progress then having learned nothing at all before c behavioral tests can also expose heuristics the model may be using yes but only if we actually already know that the heuristics exist and know how to explicitly encode them and create probes for them they cant teach us any new heuristics that we didnt already know about so maybe better phrasing could be something like behavioral tests can also confirm whether or not the model may be using certain heuristics it may be useful to discuss the apparent limitation that quite a bit of hex-specific knowledge is used for setting up the probes discussed in more detail as a weakness above it may be useful to discuss the potential limitation i discussed in more detail above that the behavioral tests may simply all be testing for an agents ability to recognise when it needs to interrupt an opponents immediate winning threat ### Summary:
this paper uses methods from interpretability to study the knowledge learned by alphazero in the game of hex in particular the networks outputs are correlated with various hand-designed features the reviewers are all in agreement that there are some interesting contributions there was some comparison with mcgrath et al but given that this paper is relatively recent and still unpublished this does not seem like an insurmountable blocker there was also some debate on whether evaluating on hex is sufficiently relevant to the research community or whether the kind of probes used are particularly insightful i agree that the latter is a weak point of the paper while the former is a reasonable concern but more minor another concern is what we do from this paper onwards ie how will this research feed into future work in follow-up discussion a majority of reviewers argued that there were useful nuggets of knowledge produced by this paper
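As a companion to the probing discussion in the reviews above, the following is a minimal sketch of a linear concept probe with a permutation control built the way the last reviewer reads it (one bijective cell-to-cell shuffle per board). All names, shapes, and the synthetic data are assumptions made for illustration; this is not the authors' code, their concept set, or their network.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_boards, n_cells, hidden = 512, 11 * 11, 64

boards = rng.integers(0, 3, size=(n_boards, n_cells)).astype(float)  # 0 empty, 1 black, 2 white
labels = rng.integers(0, 2, size=n_boards)                           # synthetic binary concept labels

def control_board(board, rng):
    # the "random control": every original cell maps to exactly one randomly chosen
    # control cell (a bijection), carrying its black/white/empty value with it
    return board[rng.permutation(board.shape[0])]

controls = np.stack([control_board(b, rng) for b in boards])

def activations(x, w):
    # placeholder for reading an intermediate layer of the trained network
    return np.tanh(x @ w)

w = rng.normal(size=(n_cells, hidden)) / np.sqrt(n_cells)
probe = LogisticRegression(max_iter=2000).fit(activations(boards, w), labels)
control_probe = LogisticRegression(max_iter=2000).fit(activations(controls, w), labels)

# with real activations and real concept labels, a probe accuracy well above the
# control probe would suggest the concept is linearly decodable from that layer
print(probe.score(activations(boards, w), labels),
      control_probe.score(activations(controls, w), labels))
```

On this synthetic data both scores sit near chance, which is exactly what the control is for; the interesting case in the paper is when the probe on real activations clearly beats its permuted control.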
894, 9046, 49343, 17, 891, 5476, 326, 3711, 1016, 894, 275, 326, 4450, 281, 247, 3632, 894, 2097, 6153, 247, 3632, 10603, 432, 512, 1341, 275, 253, 3236, 4450, 281, 512, 1341, 275, 253, 1453, 4450, 275, 247, 1039, 824, 326, 1046, 3236, 894, 8115, 281, 4555, 581, 12421, 16191, 1453, 894, 285, 1046, 1453, 894, 310, 671, 18301, 281, 407, 4555, 581, 3236, 894, 285, 840, 253, 1318, 273, 1016, 3236, 894, 2806, 11300, 11004, 310, 7922, 281, 253, 1453, 894, 326, 352, 8115, 281, 891, 5476, 436, 310, 752, 310, 2218, 285, 352, 2789, 3282, 533, 697, 417, 2233, 6843, 516, 9202, 326, 1142, 10668, 812, 23452, 1676, 352, 347, 3365, 3981, 326, 1046, 894, 4850, 247, 3632, 1318, 3587, 50276, 7461, 247, 2372, 2007, 1066, 762, 7092, 4278, 352, 310, 2529, 849, 253, 16431, 275, 253, 39578, 10895, 755, 8818, 891, 9101, 352, 651, 1056, 625, 3282, 281, 2686, 6266, 436, 1078, 12930, 849, 253, 11038, 5760, 403, 3562, 50274, 251, 14855, 374, 50276, 66, 253, 14613, 5216, 6388, 3054, 3562, 5742, 824, 326, 597, 891, 3831, 253, 4473, 533, 671, 21255, 4831, 326, 253, 5570, 4745, 7120, 2556, 281, 253, 4473, 984, 352, 588, 7168, 5010, 275, 253, 2165, 273, 15442, 436, 2097, 326, 512, 273, 841, 4450, 3054, 323, 512, 841, 1027, 12342, 2686, 2486, 581, 625, 747, 4473, 326, 310, 6096, 2439, 512, 253, 5216, 247, 4473, 326, 3183, 3013, 247, 1048, 5931, 273, 9054, 7437, 326, 310, 670, 281, 2489, 247, 9880, 4602, 604, 417, 21018, 407, 4882, 275, 752, 310, 3798, 816, 581, 390, 767, 5780, 9912, 1341, 275, 875, 594, 891, 513, 417, 2868, 326, 359, 476, 1333, 342, 2233, 23140, 326, 512, 841, 3879, 5216, 403, 2686, 5175, 323, 253, 4473, 326, 368, 18607, 731, 281, 1071, 323, 690, 390, 512, 273, 731, 778, 3365, 320, 5175, 625, 3839, 604, 253, 5570, 476, 31410, 672, 352, 3198, 281, 11606, 253, 18062, 594, 834, 706, 999, 249, 920, 5931, 50276, 67, 3036, 608, 2722, 1941, 326, 690, 1491, 310, 6311, 1078, 253, 1566, 310, 2104, 281, 897, 253, 12342, 50276, 74, 1158, 1941, 778, 320, 1512, 2266, 1060, 285, 651, 1333, 1633, 625, 751, 3036, 608, 5936, 326, 690, 1491, 778, 320, 6311, 50276, 23693, 1037, 3036, 608, 816, 2722, 326, 627, 310, 3839, 247, 1048, 2180, 342, 642, 4780, 327, 253, 5216, 285, 846, 247, 1048, 673, 8423, 5233, 4780, 327, 253, 5216, 281, 479, 436, 6296, 5936, 326, 352, 310, 2779, 326, 352, 310, 4715, 1633, 2010, 806, 533, 352, 310, 417, 1892, 1941, 352, 812, 671, 320, 326, 697, 816, 12421, 31590, 670, 253, 4764, 2317, 285, 8423, 4850, 13476, 285, 2789, 3158, 4780, 840, 1907, 6311, 2717, 387, 512, 1078, 50276, 68, 14613, 5216, 476, 671, 22065, 344, 321, 3397, 253, 1566, 778, 320, 970, 50276, 9820, 533, 760, 604, 359, 2686, 2168, 871, 326, 253, 344, 321, 3397, 2226, 285, 871, 849, 281, 11120, 22573, 731, 285, 2794, 19432, 323, 731, 597, 16216, 9798, 441, 667, 747, 344, 321, 3397, 326, 359, 42126, 2168, 871, 670, 594, 5046, 1805, 9839, 2355, 812, 320, 1633, 751, 14613, 5216, 476, 671, 6583, 1880, 390, 417, 253, 1566, 778, 320, 970, 2176, 344, 321, 3397, 50276, 262, 778, 320, 4217, 281, 2319, 253, 5165, 12291, 326, 3240, 247, 2372, 273, 15442, 6160, 3640, 310, 908, 323, 4758, 598, 253, 19432, 5469, 275, 625, 2508, 347, 247, 14855, 1840, 50276, 262, 778, 320, 4217, 281, 2319, 253, 2442, 12291, 891, 5469, 275, 625, 2508, 1840, 326, 253, 14613, 5216, 778, 3365, 512, 320, 5175, 323, 271, 6083, 3745, 281, 31410, 672, 352, 3198, 281, 11606, 271, 18062, 8993, 9880, 4322, 2490, 187, 4118, 18435, 27, 2520, 2929, 4648, 3082, 432, 4665, 1430, 281, 1263, 253, 3640, 6311, 407, 355, 545, 1370, 2771, 275, 253, 2165, 273, 15442, 275, 1798, 253, 6928, 
18012, 403, 9578, 342, 2710, 1133, 38061, 3386, 50276, 783, 30628, 403, 512, 275, 4345, 326, 627, 403, 690, 4722, 9021, 627, 369, 690, 5301, 342, 278, 68, 737, 506, 1162, 355, 533, 1677, 326, 436, 2929, 310, 4942, 3332, 285, 1335, 27085, 436, 1057, 417, 1646, 751, 271, 1210, 321, 19065, 494, 45859, 627, 369, 671, 690, 8881, 327, 1880, 16344, 327, 15442, 310, 10481, 4623, 281, 253, 2561, 3114, 390, 1880, 253, 2238, 273, 19432, 908, 403, 3782, 47860, 891, 5194, 326, 253, 6158, 310, 247, 5075, 1127, 273, 253, 2929, 1223, 253, 3438, 310, 247, 5272, 4468, 533, 625, 5884, 1529, 4468, 310, 752, 359, 513, 432, 436, 2929, 39210, 50276, 466, 849, 588, 436, 2561, 3997, 715, 2852, 789, 275, 956, 484, 5955, 247, 5020, 273, 30628, 9125, 326, 627, 497, 4217, 295, 814, 18145, 273, 3640, 4197, 407, 436, 2929 ]
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes a tomographic autoencoder tae for unsupervised recovery of corrupted data more specifically tae takes a bayesian approach to recover the posterior distribution of a clean image conditioned on an observed corrupted image and thus effectively modeling uncertainty in data recovery the paper argues that a naive application of vae is not effective due to the latent variable collapse and proposes an alternative model where hierarchical latent variable models are used for both prior and variational posterior some tricks are introduced to facilitate the stochastic gradient variational inference i think the paper is tackling an important problem and i advocate the use of a vaelike model for uncertainty modeling the paper is clearly written with helpful figures the experimental results at least compared to the baseline mvae look promising i like the way the methods are compared using the downstream task however im not sure whether a new model should be developed besides the existing approaches the authors state that the problem of a vanilla vae with hierarchical latent structure latent code z clean image x is that it is prone to latent variable collapse this is true but there are plenty of existing works partially resolving this problem the most relevant approach i can think of is the semiimplicit variational inference 1 for which a hierarchical latent variable model is used for the variational distribution similar to the setting considered in this paper 1 proposes a theoretically guaranteed solution to prevent a trivial case where the lowerlevel latent variable z completely collapses into a point mass and i think this can directly be applied for the problem considered in this paper also as 2 pointed out one can consider using a more expressive prior distribution for latent code z to combat latent variable collapse for instance flowbased models may be employed for both prior and variational posterior the proposed design also makes sense but i dont think they are more expressive than the models listed above if that is the case there should be a special factor making tae specifically wellsuited for the problem at hand the recovery of corrupted images but i failed to find such a thing is there any reason that the existing models other than the vanilla vae cannot be considered also the baseline mvae is not clearly described so it is not clear how mvae was actually implemented i recommend giving a more detailed description of the baseline at least in the appendix references 1 yin and zhou semiimplicit variational inference icml 2018 2 chen et al variational lossy autoencoder iclr 2016docsepreview this paper proposes a novel approach to handle the recovery of dirty data in fully unsupervised scenarios the corrupted data setting considers both missing data and noisy samples they derive a vae model with a novel reduced entropy condition inference method that results in richer posteriors this is a very challenging problem since the model cannot use clean examples as part of their training procedure questionscomments it feels to me that the main focus of this paper is on missing data rather than on other types of corruption i get this impression mainly from the experimental section and the methods used for comparison which are well known for handling missing data something i would have appreciated in this work is to observe the performance of the authors method in scenarios handling missing
data and corrupted data separately nonetheless it is interesting to see experiments with both effects combined which is not so common in the literature would it be possible to get results beyond maximizing the elbo the elbo contains the reconstruction term of the images but also many other terms in the end in a missing data imputation model it is interesting to get an idea of how good the reconstruction of the images is i can see from figures 3 and 4 that they should be good but having a different metric might be helpful i find it surprising that in table 1 the elbo of tae is almost double that of the other methods while the average reconstruction of mvae for example does not look that much different in figure 3 from the ground truth and it is pretty similar to tae summary the paper is well written and the idea is novel as far as i know the notation is clear and the proofs in the appendix look sensible the experimental section showcases several scenarios where they compare to unsupervised generative models to handle missing data the analysis of corrupted data outside of missing data seems a bit lackingdocsepsummary the paper proposes a method for reconstructing noisefree data instances without assuming any ground truths for this using a variant of autoencoder that avoids posterior collapse by utilising a newly proposed reduced entropy condition the problem itself is important and the proposed method seems to offer good empirical performance for the task outperforming a good selection of recent methods the paper is well written with ample citations for relevant work and provides sufficient technical details reasons for score a good paper with no obvious flaws i do not have concrete improvement suggestions additional comments the key concept of empirical prior and its use for regularising the autoencoder is good and intuitive and i could not spot any obvious theoretical or practical issues with the idea in the end the theoretical development results in a fairly simple modification of the standard vae objective that can be directly trained using standard algorithms this is both a strength the approach is easy to implement and can be plugged into existing models and a minor weakness in terms of technical depth nevertheless the justification for the loss is well explained in the paper the empirical experiments are sufficiently comprehensive and show good empirical performance in a range of tasks against reasonable comparison methods as a minor comment i would recommend using a more consistent style for the figures eg fig 3 and fig 5 use very different visual styledocseppractical datasets often come with corruptions such as missing items or noisy observations thus one needs models able to recover the corrupted data automatically this paper presents the tomographic autoencoder tvae which conducts inference over the data space x because the prior regularization acts over the data space tvae is enforced to generate diverse samples from the corrupted observations empirically the paper demonstrates that tvae can indeed generate diverse samples and can achieve superior test elbo compared to the previous baselines missing data imputation is one important problem in machine learning given the imperfection of the practical data specifically tvae focuses on two properties 1 unsupervised learning which resolves the intractability of labelling large datasets 2 diverse recoveries which attempt to generate multiple possible samples instead of collapsing onto one possibility towards these two properties tvae proposes
to conduct inference over the data space x whose prior directly prevents it from collapsing to resolve the intractability of the entropy h(q(x|y)) tvae identifies the reduced entropy condition and transforms the intractable elbo maximization into a constrained optimization problem the resulting model demonstrates superior performance compared to baselines in terms of generating diverse samples i think this work makes an important contribution this paper is well written conditional neural process garnelo et al 2018a and neural process garnelo et al 2018b are another stream of models for missing data imputation which are missed by the paper regarding the reduced entropy condition though it ensures that the entropy is decomposed into more tractable forms will enforcing this condition limit the expressiveness of the encoder q(x|y) ### Summary:
summary the authors propose a bayesian approach to data cleaning implemented via a variational autoencoder they argue that a common problem in this context is posteriors that overfit by concentrating on a lowdimensional subset and introduce an optimization target intended to discourage that behavior discussion arguably the main concern brought up in the reviews was how much novelty there is in addressing latent variable posterior collapse for which solutions have been proposed the authors were able to clarify that this was due to a misunderstanding the collapse they address is not in latent space and the reviewer considers the matter resolved recommendation i recommend publication the reviewers are all positive and agree that the method is interesting and seems novel the writing is clear and remaining doubts have been addressed in the discussion
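the reviews above repeatedly refer to the elbo of a vae-style recovery model and to the entropy of the posterior q(x|y) whose collapse would destroy diverse recoveries; the short pytorch sketch below only illustrates those quantities under an assumed diagonal gaussian posterior, it is not the tae/tvae architecture from the reviewed paper and all names in it are hypothetical

```python
import math
import torch
import torch.nn as nn

class RecoveryPosterior(nn.Module):
    """Illustrative amortized posterior q(x | y) over clean data x given a
    corrupted observation y (diagonal Gaussian; not the reviewed model)."""
    def __init__(self, dim):
        super().__init__()
        self.mu = nn.Linear(dim, dim)
        self.logvar = nn.Linear(dim, dim)

    def forward(self, y):
        mu, logvar = self.mu(y), self.logvar(y)
        # reparameterized sample of a candidate clean recovery x ~ q(x | y)
        x = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        # entropy of the diagonal Gaussian; if this collapses, the posterior
        # concentrates on a single point and diverse recoveries are lost
        entropy = 0.5 * (logvar + 1.0 + math.log(2.0 * math.pi)).sum(-1)
        return x, entropy

def elbo(log_lik, log_prior, entropy):
    # ELBO = E_q[log p(y | x) + log p(x)] + H[q(x | y)], one-sample estimate
    return log_lik + log_prior + entropy

q = RecoveryPosterior(dim=8)
y = torch.randn(4, 8)                                 # batch of corrupted observations
x, h = q(y)
print(elbo(log_lik=torch.zeros(4), log_prior=torch.zeros(4), entropy=h))
```

the reviewers' discussion of the reduced entropy condition concerns exactly this entropy term, which is intractable for richer posteriors and is handled in the paper via a constrained optimization rather than the placeholder terms used above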
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 247, 7275, 5576, 6753, 36465, 246, 3348, 323, 440, 35421, 7355, 273, 40634, 941, 625, 5742, 246, 3348, 3936, 247, 17699, 16561, 2746, 281, 9295, 253, 12637, 3268, 273, 247, 4076, 2460, 27039, 327, 271, 2540, 40634, 2460, 285, 3021, 8069, 14053, 11649, 275, 941, 7355, 253, 2929, 8219, 326, 247, 27785, 2898, 273, 362, 3348, 310, 417, 3576, 1955, 281, 253, 21624, 4778, 13551, 285, 29328, 271, 5795, 1566, 835, 24498, 21624, 4778, 3210, 403, 908, 323, 1097, 2720, 285, 39762, 12637, 690, 24866, 403, 5611, 281, 12454, 253, 19191, 11786, 39762, 17032, 50275, 74, 1158, 253, 2929, 310, 46710, 271, 1774, 1895, 285, 891, 21424, 253, 897, 273, 247, 362, 4696, 2804, 1566, 323, 11649, 14053, 253, 2929, 310, 4518, 3542, 342, 9371, 8442, 253, 5661, 1543, 387, 1878, 2429, 281, 253, 8245, 278, 21574, 4453, 12532, 891, 751, 253, 1039, 253, 3082, 403, 2429, 970, 253, 15450, 4836, 50276, 35529, 516, 417, 2119, 1880, 247, 747, 1566, 943, 320, 3715, 16280, 253, 5368, 7274, 253, 4477, 1375, 326, 253, 1895, 273, 247, 26724, 362, 3348, 342, 24498, 21624, 2605, 21624, 2127, 1182, 50276, 16437, 2460, 1269, 310, 326, 352, 310, 21291, 281, 21624, 4778, 13551, 50276, 2520, 310, 2032, 533, 627, 403, 9828, 273, 5368, 2987, 10571, 30426, 436, 1895, 253, 954, 4623, 2746, 891, 476, 1158, 273, 310, 253, 10020, 303, 20692, 39762, 17032, 337, 323, 534, 247, 24498, 21624, 4778, 1566, 310, 908, 323, 253, 39762, 3268, 2074, 281, 253, 4758, 2783, 275, 436, 2929, 337, 29328, 247, 28055, 16293, 2900, 281, 3657, 247, 14916, 1083, 835, 253, 2406, 5251, 21624, 4778, 1182, 4336, 3007, 23508, 715, 247, 1127, 2280, 285, 891, 1158, 436, 476, 3587, 320, 3732, 323, 253, 1895, 2783, 275, 436, 2929, 671, 347, 374, 8042, 562, 326, 581, 476, 1908, 970, 247, 625, 43541, 2720, 3268, 323, 21624, 2127, 1182, 281, 11757, 21624, 4778, 13551, 323, 4227, 2685, 3169, 3210, 778, 320, 7091, 323, 1097, 2720, 285, 39762, 12637, 253, 4081, 2216, 671, 2789, 3282, 533, 891, 13414, 1158, 597, 403, 625, 43541, 685, 253, 3210, 7117, 1840, 604, 326, 310, 253, 1083, 627, 943, 320, 247, 2714, 2803, 2403, 246, 3348, 5742, 973, 3467, 959, 323, 253, 1895, 387, 1133, 50276, 783, 7355, 273, 40634, 3888, 50276, 2858, 891, 4242, 281, 1089, 824, 247, 2181, 310, 627, 667, 1921, 326, 253, 5368, 3210, 643, 685, 253, 26724, 362, 3348, 2550, 320, 2783, 50276, 12563, 253, 8245, 278, 21574, 310, 417, 4518, 2529, 594, 352, 310, 417, 2590, 849, 278, 21574, 369, 2686, 9009, 891, 5583, 4933, 247, 625, 7000, 5740, 273, 253, 8245, 387, 1878, 275, 253, 30762, 50276, 250, 3065, 337, 340, 249, 285, 1182, 14451, 10020, 303, 20692, 39762, 17032, 17857, 1686, 4765, 374, 260, 864, 1162, 355, 39762, 2957, 90, 6753, 36465, 17857, 32888, 4022, 7152, 339, 42150, 50276, 2520, 2929, 29328, 247, 4460, 2746, 281, 6016, 253, 7355, 273, 16076, 941, 275, 4751, 440, 35421, 15216, 253, 40634, 941, 19401, 1097, 5816, 941, 285, 27620, 3530, 597, 15313, 247, 362, 3348, 1566, 342, 247, 4460, 3777, 15579, 1617, 17032, 1332, 326, 1543, 275, 38539, 20731, 17327, 436, 310, 247, 1077, 11132, 1895, 1580, 253, 1566, 2550, 897, 4076, 6667, 347, 629, 273, 616, 3733, 5199, 50276, 34974, 26122, 50275, 262, 9193, 281, 479, 326, 253, 2022, 2770, 273, 436, 2929, 310, 327, 5816, 941, 2581, 685, 275, 643, 3510, 273, 16933, 891, 755, 436, 13214, 7194, 432, 253, 5661, 2593, 285, 3082, 908, 323, 5301, 973, 1929, 323, 10885, 5816, 941, 1633, 891, 651, 452, 14109, 275, 436, 
789, 310, 281, 10018, 253, 3045, 273, 253, 4477, 1332, 275, 15216, 10885, 5816, 941, 285, 40634, 941, 11794, 23188, 352, 310, 4722, 281, 923, 4679, 342, 1097, 2538, 5678, 534, 310, 417, 594, 1846, 275, 253, 6239, 50275, 12756, 352, 320, 1896, 281, 755, 1543, 4457, 46875, 253, 1045, 2399, 253, 1045, 2399, 4428, 253, 14433, 1307, 273, 253, 3888, 533, 671, 1142, 643, 2426, 275, 253, 990, 275, 247, 5816, 941, 516, 10340, 1566, 352, 310, 4722, 281, 755, 271, 2934, 273, 849, 1175, 253, 14433, 273, 253, 3888, 310, 891, 476, 923, 432, 8442, 495, 285, 577, 326, 597, 943, 320, 1175, 533, 1907, 247, 1027, 7982, 1537, 320, 9371, 891, 1089, 352, 10084, 326, 275, 2829, 337, 253, 1045, 2399, 273, 246, 3348, 310, 2761, 4021, 273, 253, 643, 3082, 1223, 253, 3388, 14433, 273, 278, 21574, 323, 1650, 1057, 417, 1007, 326, 1199, 1027, 275, 4677, 495, 432, 253, 3216, 5083, 285, 352, 310, 3965, 2074, 281, 246, 3348, 50276, 8774, 50276, 783, 2929, 310, 973, 3542, 285, 253, 2934, 310, 4460, 347, 2080, 347, 891, 871, 253, 14951, 310, 2590, 285, 253, 27947, 275, 253, 30762, 1007, 24600, 253, 5661, 2593, 921, 12866, 2067, 15216, 835, 597, 7277, 281, 440, 35421, 1006, 800, 3210, 281, 6016, 5816, 941, 253, 1783, 273, 40634, 941, 3345, 273, 5816, 941, 3133, 247, 2372, 14999, 7152, 339, 793, 360, 3454, 253, 2929, 29328, 247, 1332, 323, 17029, 272, 642, 261, 832, 658, 941, 10872, 1293, 7384, 667, 3216, 35086, 323, 436, 970, 247, 12955, 273, 6753, 36465, 326, 32547, 12637, 13551, 407, 4981, 2182, 247, 9841, 4081, 3777, 15579, 1617, 253, 1895, 3139, 310, 1774, 285, 253, 4081, 1332, 3133, 281, 3959, 1175, 16774, 3045, 323, 253, 4836, 41731, 14692, 247, 1175, 5438, 273, 3332, 3082, 253, 2929, 310, 973, 3542, 342, 24904, 30404, 323, 4623, 789, 285, 3400, 4209, 7681, 4278, 50276, 250, 3743, 323, 4868, 247, 1175, 2929, 342, 642, 4755, 32138, 891, 513, 417, 452, 11859, 7756, 13991, 50276, 38092, 5701, 253, 2234, 4473, 273, 16774, 2720, 285, 697, 897, 323, 3963, 2182, 253, 6753, 36465, 310, 1175, 285, 27350, 285, 891, 812, 417, 6308, 667, 4755, 10527, 390, 8542, 3374, 342, 253, 2934, 275, 253, 990, 253, 10527, 2440, 1543, 275, 9648, 2969, 11237, 323, 253, 2629, 362, 3348, 8103, 326, 476, 320, 3587, 10166, 970, 2629, 11333, 436, 310, 1097, 247, 4757, 253, 2746, 310, 3477, 281, 3359, 285, 476, 320, 43867, 715, 5368, 3210, 285, 247, 5884, 14855, 275, 2426, 273, 7681, 6864, 17837, 253, 22861, 323, 253, 2957, 310, 973, 5544, 275, 253, 2929, 50276, 783, 16774, 4679, 403, 10481, 11088, 285, 921, 1175, 16774, 3045, 275, 247, 2491, 273, 8892, 1411, 5272, 5301, 3082, 347, 247, 5884, 4385, 891, 651, 5583, 970, 247, 625, 5185, 3740, 323, 253, 8442, 24088, 3036, 495, 285, 3036, 608, 897, 1077, 1027, 5304, 49879, 406, 339, 377, 26080, 15302, 2223, 1705, 342, 17715, 621, 824, 347, 5816, 4957, 390, 27620, 7313, 3021, 3198, 3210, 8046, 281, 9295, 253, 40634, 941, 8356, 436, 2929, 10262, 253, 7275, 5576, 6753, 36465, 246, 21574, 534, 2589, 84, 17032, 689, 253, 941, 2317, 1269, 984, 253, 2720, 37820, 6993, 689, 253, 941, 2317, 246, 21574, 310, 27810, 281, 6635, 11117, 3530, 432, 253, 40634, 7313, 45190, 253, 2929, 14371, 326, 246, 21574, 476, 6296, 6635, 11117, 3530, 285, 476, 5115, 8936, 1071, 1045, 2399, 2429, 281, 253, 2045, 1666, 25379, 50276, 33722, 941, 516, 10340, 310, 581, 1774, 1895, 275, 5145, 4715, 1677, 253, 9719, 3716, 273, 253, 8542, 941, 5742, 246, 21574, 16633, 327, 767, 3607, 337, 440, 35421, 4715, 534, 501, 14503, 253, 540, 974, 1430, 273, 46684, 1781, 15302, 374, 11117, 9295, 447, 534, 9437, 281, 6635, 2709, 1896, 3530, 3185, 273, 
45130, 4830, 581, 6387, 4404, 841, 767, 3607, 246, 21574, 29328, 281, 2589, 17032, 689, 253, 941, 2317, 1269, 3692, 2720, 3587, 16897, 432, 45130, 50276, 936, 11322, 253, 540, 974, 1430, 273, 253, 15579, 288, 82, 5246, 246, 21574, 22649, 253, 3777, 15579, 1617, 285, 29698, 253, 540, 44374, 1045, 2399, 11903, 1320, 715, 247, 20793, 13757, 1895, 253, 4795, 1566, 14371, 8936, 16226, 2429, 281, 1666, 25379, 275, 2426, 273, 11365, 11117, 3530, 891, 1158, 436, 789, 2789, 271, 1774, 7680, 50276, 2520, 2929, 310, 973, 3542, 50276, 35428, 11454, 1232, 34226, 29595, 1162, 355, 4765, 66, 285, 11454, 1232, 34226, 29595, 1162, 355, 4765, 67, 403, 1529, 5542, 273, 3210, 323, 5816, 941, 516, 10340, 534, 403, 9829, 407, 253, 2929, 50276, 1747, 13218, 253, 3777, 15579, 1617, 2167, 352, 20096, 326, 253, 15579, 310, 45765, 715, 625, 10649, 494, 4948, 588, 37703, 436, 1617, 2701, 253, 3890, 6460, 273, 253, 32049, 2805, 89, 50276, 90, 2490, 187, 4118, 18435, 27, 8774, 50276, 783, 4477, 12661, 247, 17699, 16561, 2746, 281, 941, 12478, 9009, 3066, 247, 39762, 6753, 36465, 597, 9059, 326, 247, 1846, 1895, 275, 436, 3634, 403, 20731, 17327, 326, 689, 8491, 407, 45012, 327, 247, 1698, 6967, 8578, 285, 9569, 271, 13757, 2303, 6034, 281, 43162, 326, 3879, 50276, 49794, 50276, 1662, 86, 1598, 253, 2022, 4468, 3982, 598, 275, 253, 10123, 369, 849, 1199, 38135, 627, 310, 275, 15974, 21624, 4778, 12637, 13551, 5482, 323, 534, 452, 644, 4081, 253, 4477, 497, 2104, 281, 19148, 326, 436, 369, 1955, 281, 247, 40663, 253, 13551, 597, 2953, 310, 417, 275, 21624, 2317, 285, 253, 37317, 19401, 253, 2647, 11512, 50275, 250, 27167, 318, 50276, 74, 5583, 9311, 253, 30628, 403, 512, 2762, 5194, 326, 253, 1332, 310, 4722, 285, 3133, 4460, 253, 4028, 310, 2590, 285, 5780, 24626, 452, 644, 9713, 275, 253, 5955, 209 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 247, 7275, 5576, 6753, 36465, 246, 3348, 323, 440, 35421, 7355, 273, 40634, 941, 625, 5742, 246, 3348, 3936, 247, 17699, 16561, 2746, 281, 9295, 253, 12637, 3268, 273, 247, 4076, 2460, 27039, 327, 271, 2540, 40634, 2460, 285, 3021, 8069, 14053, 11649, 275, 941, 7355, 253, 2929, 8219, 326, 247, 27785, 2898, 273, 362, 3348, 310, 417, 3576, 1955, 281, 253, 21624, 4778, 13551, 285, 29328, 271, 5795, 1566, 835, 24498, 21624, 4778, 3210, 403, 908, 323, 1097, 2720, 285, 39762, 12637, 690, 24866, 403, 5611, 281, 12454, 253, 19191, 11786, 39762, 17032, 50275, 74, 1158, 253, 2929, 310, 46710, 271, 1774, 1895, 285, 891, 21424, 253, 897, 273, 247, 362, 4696, 2804, 1566, 323, 11649, 14053, 253, 2929, 310, 4518, 3542, 342, 9371, 8442, 253, 5661, 1543, 387, 1878, 2429, 281, 253, 8245, 278, 21574, 4453, 12532, 891, 751, 253, 1039, 253, 3082, 403, 2429, 970, 253, 15450, 4836, 50276, 35529, 516, 417, 2119, 1880, 247, 747, 1566, 943, 320, 3715, 16280, 253, 5368, 7274, 253, 4477, 1375, 326, 253, 1895, 273, 247, 26724, 362, 3348, 342, 24498, 21624, 2605, 21624, 2127, 1182, 50276, 16437, 2460, 1269, 310, 326, 352, 310, 21291, 281, 21624, 4778, 13551, 50276, 2520, 310, 2032, 533, 627, 403, 9828, 273, 5368, 2987, 10571, 30426, 436, 1895, 253, 954, 4623, 2746, 891, 476, 1158, 273, 310, 253, 10020, 303, 20692, 39762, 17032, 337, 323, 534, 247, 24498, 21624, 4778, 1566, 310, 908, 323, 253, 39762, 3268, 2074, 281, 253, 4758, 2783, 275, 436, 2929, 337, 29328, 247, 28055, 16293, 2900, 281, 3657, 247, 14916, 1083, 835, 253, 2406, 5251, 21624, 4778, 1182, 4336, 3007, 23508, 715, 247, 1127, 2280, 285, 891, 1158, 436, 476, 3587, 320, 3732, 323, 253, 1895, 2783, 275, 436, 2929, 671, 347, 374, 8042, 562, 326, 581, 476, 1908, 970, 247, 625, 43541, 2720, 3268, 323, 21624, 2127, 1182, 281, 11757, 21624, 4778, 13551, 323, 4227, 2685, 3169, 3210, 778, 320, 7091, 323, 1097, 2720, 285, 39762, 12637, 253, 4081, 2216, 671, 2789, 3282, 533, 891, 13414, 1158, 597, 403, 625, 43541, 685, 253, 3210, 7117, 1840, 604, 326, 310, 253, 1083, 627, 943, 320, 247, 2714, 2803, 2403, 246, 3348, 5742, 973, 3467, 959, 323, 253, 1895, 387, 1133, 50276, 783, 7355, 273, 40634, 3888, 50276, 2858, 891, 4242, 281, 1089, 824, 247, 2181, 310, 627, 667, 1921, 326, 253, 5368, 3210, 643, 685, 253, 26724, 362, 3348, 2550, 320, 2783, 50276, 12563, 253, 8245, 278, 21574, 310, 417, 4518, 2529, 594, 352, 310, 417, 2590, 849, 278, 21574, 369, 2686, 9009, 891, 5583, 4933, 247, 625, 7000, 5740, 273, 253, 8245, 387, 1878, 275, 253, 30762, 50276, 250, 3065, 337, 340, 249, 285, 1182, 14451, 10020, 303, 20692, 39762, 17032, 17857, 1686, 4765, 374, 260, 864, 1162, 355, 39762, 2957, 90, 6753, 36465, 17857, 32888, 4022, 7152, 339, 42150, 50276, 2520, 2929, 29328, 247, 4460, 2746, 281, 6016, 253, 7355, 273, 16076, 941, 275, 4751, 440, 35421, 15216, 253, 40634, 941, 19401, 1097, 5816, 941, 285, 27620, 3530, 597, 15313, 247, 362, 3348, 1566, 342, 247, 4460, 3777, 15579, 1617, 17032, 1332, 326, 1543, 275, 38539, 20731, 17327, 436, 310, 247, 1077, 11132, 1895, 1580, 253, 1566, 2550, 897, 4076, 6667, 347, 629, 273, 616, 3733, 5199, 50276, 34974, 26122, 50275, 262, 9193, 281, 479, 326, 253, 2022, 2770, 273, 436, 2929, 310, 327, 5816, 941, 2581, 685, 275, 643, 3510, 273, 16933, 891, 755, 436, 13214, 7194, 432, 253, 5661, 2593, 285, 3082, 908, 323, 5301, 973, 1929, 323, 10885, 5816, 941, 1633, 891, 651, 452, 14109, 275, 436, 
789, 310, 281, 10018, 253, 3045, 273, 253, 4477, 1332, 275, 15216, 10885, 5816, 941, 285, 40634, 941, 11794, 23188, 352, 310, 4722, 281, 923, 4679, 342, 1097, 2538, 5678, 534, 310, 417, 594, 1846, 275, 253, 6239, 50275, 12756, 352, 320, 1896, 281, 755, 1543, 4457, 46875, 253, 1045, 2399, 253, 1045, 2399, 4428, 253, 14433, 1307, 273, 253, 3888, 533, 671, 1142, 643, 2426, 275, 253, 990, 275, 247, 5816, 941, 516, 10340, 1566, 352, 310, 4722, 281, 755, 271, 2934, 273, 849, 1175, 253, 14433, 273, 253, 3888, 310, 891, 476, 923, 432, 8442, 495, 285, 577, 326, 597, 943, 320, 1175, 533, 1907, 247, 1027, 7982, 1537, 320, 9371, 891, 1089, 352, 10084, 326, 275, 2829, 337, 253, 1045, 2399, 273, 246, 3348, 310, 2761, 4021, 273, 253, 643, 3082, 1223, 253, 3388, 14433, 273, 278, 21574, 323, 1650, 1057, 417, 1007, 326, 1199, 1027, 275, 4677, 495, 432, 253, 3216, 5083, 285, 352, 310, 3965, 2074, 281, 246, 3348, 50276, 8774, 50276, 783, 2929, 310, 973, 3542, 285, 253, 2934, 310, 4460, 347, 2080, 347, 891, 871, 253, 14951, 310, 2590, 285, 253, 27947, 275, 253, 30762, 1007, 24600, 253, 5661, 2593, 921, 12866, 2067, 15216, 835, 597, 7277, 281, 440, 35421, 1006, 800, 3210, 281, 6016, 5816, 941, 253, 1783, 273, 40634, 941, 3345, 273, 5816, 941, 3133, 247, 2372, 14999, 7152, 339, 793, 360, 3454, 253, 2929, 29328, 247, 1332, 323, 17029, 272, 642, 261, 832, 658, 941, 10872, 1293, 7384, 667, 3216, 35086, 323, 436, 970, 247, 12955, 273, 6753, 36465, 326, 32547, 12637, 13551, 407, 4981, 2182, 247, 9841, 4081, 3777, 15579, 1617, 253, 1895, 3139, 310, 1774, 285, 253, 4081, 1332, 3133, 281, 3959, 1175, 16774, 3045, 323, 253, 4836, 41731, 14692, 247, 1175, 5438, 273, 3332, 3082, 253, 2929, 310, 973, 3542, 342, 24904, 30404, 323, 4623, 789, 285, 3400, 4209, 7681, 4278, 50276, 250, 3743, 323, 4868, 247, 1175, 2929, 342, 642, 4755, 32138, 891, 513, 417, 452, 11859, 7756, 13991, 50276, 38092, 5701, 253, 2234, 4473, 273, 16774, 2720, 285, 697, 897, 323, 3963, 2182, 253, 6753, 36465, 310, 1175, 285, 27350, 285, 891, 812, 417, 6308, 667, 4755, 10527, 390, 8542, 3374, 342, 253, 2934, 275, 253, 990, 253, 10527, 2440, 1543, 275, 9648, 2969, 11237, 323, 253, 2629, 362, 3348, 8103, 326, 476, 320, 3587, 10166, 970, 2629, 11333, 436, 310, 1097, 247, 4757, 253, 2746, 310, 3477, 281, 3359, 285, 476, 320, 43867, 715, 5368, 3210, 285, 247, 5884, 14855, 275, 2426, 273, 7681, 6864, 17837, 253, 22861, 323, 253, 2957, 310, 973, 5544, 275, 253, 2929, 50276, 783, 16774, 4679, 403, 10481, 11088, 285, 921, 1175, 16774, 3045, 275, 247, 2491, 273, 8892, 1411, 5272, 5301, 3082, 347, 247, 5884, 4385, 891, 651, 5583, 970, 247, 625, 5185, 3740, 323, 253, 8442, 24088, 3036, 495, 285, 3036, 608, 897, 1077, 1027, 5304, 49879, 406, 339, 377, 26080, 15302, 2223, 1705, 342, 17715, 621, 824, 347, 5816, 4957, 390, 27620, 7313, 3021, 3198, 3210, 8046, 281, 9295, 253, 40634, 941, 8356, 436, 2929, 10262, 253, 7275, 5576, 6753, 36465, 246, 21574, 534, 2589, 84, 17032, 689, 253, 941, 2317, 1269, 984, 253, 2720, 37820, 6993, 689, 253, 941, 2317, 246, 21574, 310, 27810, 281, 6635, 11117, 3530, 432, 253, 40634, 7313, 45190, 253, 2929, 14371, 326, 246, 21574, 476, 6296, 6635, 11117, 3530, 285, 476, 5115, 8936, 1071, 1045, 2399, 2429, 281, 253, 2045, 1666, 25379, 50276, 33722, 941, 516, 10340, 310, 581, 1774, 1895, 275, 5145, 4715, 1677, 253, 9719, 3716, 273, 253, 8542, 941, 5742, 246, 21574, 16633, 327, 767, 3607, 337, 440, 35421, 4715, 534, 501, 14503, 253, 540, 974, 1430, 273, 46684, 1781, 15302, 374, 11117, 9295, 447, 534, 9437, 281, 6635, 2709, 1896, 3530, 3185, 273, 
45130, 4830, 581, 6387, 4404, 841, 767, 3607, 246, 21574, 29328, 281, 2589, 17032, 689, 253, 941, 2317, 1269, 3692, 2720, 3587, 16897, 432, 45130, 50276, 936, 11322, 253, 540, 974, 1430, 273, 253, 15579, 288, 82, 5246, 246, 21574, 22649, 253, 3777, 15579, 1617, 285, 29698, 253, 540, 44374, 1045, 2399, 11903, 1320, 715, 247, 20793, 13757, 1895, 253, 4795, 1566, 14371, 8936, 16226, 2429, 281, 1666, 25379, 275, 2426, 273, 11365, 11117, 3530, 891, 1158, 436, 789, 2789, 271, 1774, 7680, 50276, 2520, 2929, 310, 973, 3542, 50276, 35428, 11454, 1232, 34226, 29595, 1162, 355, 4765, 66, 285, 11454, 1232, 34226, 29595, 1162, 355, 4765, 67, 403, 1529, 5542, 273, 3210, 323, 5816, 941, 516, 10340, 534, 403, 9829, 407, 253, 2929, 50276, 1747, 13218, 253, 3777, 15579, 1617, 2167, 352, 20096, 326, 253, 15579, 310, 45765, 715, 625, 10649, 494, 4948, 588, 37703, 436, 1617, 2701, 253, 3890, 6460, 273, 253, 32049, 2805, 89, 50276, 90, 2490, 187, 4118, 18435, 27, 8774, 50276, 783, 4477, 12661, 247, 17699, 16561, 2746, 281, 941, 12478, 9009, 3066, 247, 39762, 6753, 36465, 597, 9059, 326, 247, 1846, 1895, 275, 436, 3634, 403, 20731, 17327, 326, 689, 8491, 407, 45012, 327, 247, 1698, 6967, 8578, 285, 9569, 271, 13757, 2303, 6034, 281, 43162, 326, 3879, 50276, 49794, 50276, 1662, 86, 1598, 253, 2022, 4468, 3982, 598, 275, 253, 10123, 369, 849, 1199, 38135, 627, 310, 275, 15974, 21624, 4778, 12637, 13551, 5482, 323, 534, 452, 644, 4081, 253, 4477, 497, 2104, 281, 19148, 326, 436, 369, 1955, 281, 247, 40663, 253, 13551, 597, 2953, 310, 417, 275, 21624, 2317, 285, 253, 37317, 19401, 253, 2647, 11512, 50275, 250, 27167, 318, 50276, 74, 5583, 9311, 253, 30628, 403, 512, 2762, 5194, 326, 253, 1332, 310, 4722, 285, 3133, 4460, 253, 4028, 310, 2590, 285, 5780, 24626, 452, 644, 9713, 275, 253, 5955, 209 ]
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: first of all i want to point something out i found quite bothersome the abstract states a desirable agent should be capable of balancing between different subtasks navigation to find enemies and shooting to kill them and the intro begins with it is an urgent need to use drl methods to solve more complex decisionmaking problems i want to state that i strongly believe we should not be framing our research problems with these types of problems nor trivializing concepts such as killing enemies ill try to be as unbiased as possible in my scientific evaluation of this paper but i would request that the language be toned down a bit and ideally other types of tasks considered down the road back to the review this paper presents a multitask agent architecture with a final mixture component the authors show that this approach can work on a custombuilt fps game better than competing sota methods both multitask and monotask agents overall the method uses a bilevel optimization to find the optimal mixture of subpolicies as well as individually optimize each subpolicy pros this seems like a relatively simple architecture and the empirical results are promising and the analysis of the alpha values correlation with subtasks is interesting and seems to indicate that the metacontroller does gain some insight into subtask structure cons there are many confusing points about the paper that made it hard to follow and that i would need clarified to argue for acceptance 1 doesnt min(clip(a), a) = clip(a) 2 doesnt grad_alpha ltrain(epsilon, alpha) - grad_alpha ltrain(epsilon, alpha) = 0 perhaps you meant grad_w on the second term 3 what are ltrain and lval i couldnt find a clear definition 4 do the subagents receive the values of the respective ri or is only r = sum_i alphai ri passed to the agent 5 i am not familiar with bilevel optimization it could be worth talking a bit more about this instead of environmental architecture choices which are relatively irrelevant to the core of the paper 6 could you come up with a task with more than 2 subtasks i find that 2 is likely a corner case and going beyond just two policies would make the method more convincing trying out on a smaller more synthetic environment that would more easily allow task factorization would also allow for better empirical evaluation 7 although i am not super familiar with the fps environments for rl are there some already existing environments that have been independently benchmarked any fps env has a mixture of navigation and fighting so i wonder what the value of proposing a new environment for this would be conclusion i am not an expert on the multitask literature but although the proposed idea seems to have merits the paper would need to be more clear on the points above for me to consider it clean enough for publication as it stands the approach is too opaque to really understand what is going on docsepthe paper proposes a metalearning approach for hierarchical reinforcement learning essentially metalearning parametrised weights of a highlevel controller the method is tested on a single new environment a first person shooter against a small set of baselines this review will focus on the method but i personally find that this application with particular emphasis on killing as terminology from the paper is highly inappropriate and the method could easily be tested on a less problematic domain overall the paper is clearly written and the additional analysis on js
divergence and alphas in section 53 provides an interesting perspective which sadly turns into an analysis of semantic aspects of the game in terms of killing and how different amounts of enemies are handled a main shortcoming of the submission is the evaluation section even ethical concerns aside additional baselines from the existing hrl literature would be beneficial eg 2 in addition another baseline mlsh frans et al 2017 referred to quickly in the submission would be a good baseline metalearning the highlevel while applying impala for the lowlevel the proposed method is highly related to mlsh which metalearns the lowlevel while learning the highlevel individually for each task one core argument of the paper surrounds the combination of multiple policies in a single time step vs hard switching between policies across timesteps the combination of objectives for every output is familiar in the multiobjective rl literature 1 as well as other methods in hierarchical rl 2 work that would provide stronger context for the investigation finally the evaluation focuses on a single task which has been introduced for this paper strongly limiting the viability of the analysis further multitask domains could be taken from the existing multiobjectivemultitask rl literature as well as via simple extensions of existing domains imagine a cartpole environment with additional action penalty the final controversial point of this submission is the fact that the weights for combining rewards which seem to also be used for the overall evaluation of the agent itself though this remains ambiguous in the submission are learned by the agent this could on one hand lead to the agent finding a shortcut and ignoring some given objectives and on the other hand renders these weighted returns irrelevant for comparing different agents as the weights underlying the returns are part of the agent overall the paper proposes an interesting method but the analysis has ethical problems as well as only a single environment which was created purely for this submission and limited baselines while the main ideas underlying the method remain interesting all these problems should be addressed before considering publication 1 vamplew peter et al empirical evaluation methods for multiobjective reinforcement learning algorithms machine learning 84(1-2) 2011 51-80 2 peng xue bin et al mcp learning composable hierarchical control with multiplicative compositional policies advances in neural information processing systems 2019 docsepthis paper introduces a new firstperson shooting environment consisting of two tasks navigation and eliminating enemies via shooting it describes a reinforcement learning architecture that trains individual policies for each task and automatically balances between those policies by computing weights to mix the action distributions and the rewards unfortunately this paper leaves significant open questions about the environment and experiments this makes it hard to judge the reported results chiefly it is not clear what kind of reward should be maximized the environment provides two rewards one for each subtask but figures 4 and 5 compare algorithms for a single onedimensional measure of performance there are further questions around this what kind of reward is used for the baselines which are supposedly trained with a single reward what kind of reward does the value function for mesh predict the mixed reward as determined by the metanet then the paper mentions that lstm networks are used but of what size there are
no details on complex baselines such as fun which are not trivial to implement or tune are the returns reported on a fixed set of levels figure 6 seems to imply that alpha weights sum up to one but i couldnt find any description of this furthermore details regarding the action space of the environment are missing the writeup would strongly benefit from some proofreading to fix grammar and expression technical details such as the algorithm for procedural level generation or the observation space could be put in an appendixdocsepthis paper considers an fps game that can be decomposed into two subtasks navigation and shooting a hierarchical meta rl method is introduced and the updating rules for subpolicies and meta parameters are provided experiments focus on this specific environment and hence the hierarchical structure is also specified as a meta controller over two subpolicies defined for navigation and shooting explicitly the proposed hierarchical rl method is not novel and indeed the most straightforward way to control subpolicies with a meta controller the number of subpolicies each of which is specified to solve an explicit subtask is fixed given the environment and the rewarding scheme is also clear in that each subtask has its own reward the final policy is simply presented as a linear combination of subpolicies actually similar ideas have been well studied in the literature like feudal networks and maml which are even more general meta learning methods to automatically find subpolicies these highly related approaches are ignored in the discussion and not considered in the experiments as a baseline another question is about the considered global reward which is also formulated as a linear combination of the subtask rewards using the meta parameters alphai under such a formulation the rewarding scheme is dynamically varying as the training goes on one concern is that in impala the estimated values in vtrace or advantage might suffer from large variance since at early stages the meta parameters are almost from scratch another concern is that this rewarding scheme naturally restricts the meta controller to selecting the subpolicy with high immediate ri at a specific time step and hence it seems the meta parameters are almost determined by the rewarding scheme the experimental part lacks comparisons with many related works as mentioned above moreover only one specific environment is studied with only two subtasks it is hard to see the generality of the method when scaling to cases where a large number of subtasks exist ### Summary:
the reviewers agreed that the paper presents interesting ideas but the presentation of the paper needs to be improved also the experiments and the related work section need to be improved
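the reviews above describe the method as a meta controller that mixes sub-policy action distributions and sub-task rewards with learned weights alpha, i.e. pi(a|s) = sum_i alphai pi_i(a|s) and r = sum_i alphai ri; the numpy sketch below only illustrates that combination under the assumption that the alphas sum to one, and its function names and two example sub-policies are hypothetical, not the reviewed implementation

```python
import numpy as np

def mixed_step(subpolicy_probs, subtask_rewards, alphas, rng):
    """Combine sub-policies with meta weights alpha, as the reviews describe it.

    subpolicy_probs: (n_sub, n_actions) action distributions pi_i(a | s)
    subtask_rewards: (n_sub,) per-subtask rewards r_i
    alphas:          (n_sub,) meta-controller weights, assumed to sum to 1
    """
    mixed_probs = alphas @ subpolicy_probs      # pi(a|s) = sum_i alpha_i pi_i(a|s)
    mixed_reward = float(alphas @ subtask_rewards)  # r = sum_i alpha_i r_i
    action = rng.choice(len(mixed_probs), p=mixed_probs)
    return action, mixed_reward

rng = np.random.default_rng(0)
# two sub-policies (e.g. navigation and combat) over three discrete actions
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.3, 0.6]])
action, reward = mixed_step(probs, np.array([0.0, 1.0]), np.array([0.5, 0.5]), rng)
print(action, reward)
```

in the reviewed setting the alphas would come from a learned meta controller rather than being fixed as here, which is exactly why the reviewers question using the same learned weights both to mix the rewards and to report the agent's overall performance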
[Token-ID list, all-ones attention-mask list, and label list (a duplicate of the token IDs) for the preceding example omitted: they are the machine tokenization of the text fields and are not human-readable.]
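The list fields attached to each example are just the tokenized form of its text. As a rough orientation, here is a minimal sketch of how such rows are typically constructed, assuming a standard Hugging Face tokenizer and causal-LM fine-tuning; the actual tokenizer, model, and maximum sequence length behind this dataset are not stated here and are assumptions.

```python
# Hypothetical reconstruction of one dataset row; the tokenizer choice and the
# maximum sequence length are assumptions, not read from the dataset itself.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # assumed tokenizer

def build_row(input_text: str, output_text: str, max_len: int = 2048) -> dict:
    # Prompt and target are concatenated, as is typical for causal-LM fine-tuning.
    enc = tokenizer(input_text + " " + output_text,
                    truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],            # the long token-ID list
        "attention_mask": enc["attention_mask"],  # all ones when nothing is padded
        "labels": list(enc["input_ids"]),         # duplicates the IDs for the LM loss
    }

row = build_row("Below is a review of a research paper ... ### Review: ...",
                "### Summary: ...")
```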
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
This paper proposes a tracking algorithm that designs a ModaMixer and asymmetrical networks to learn a unified, adaptive vision-language (VL) representation. The combination of the two modules gives very good results, and applying language features to tracking tasks is a very reasonable and innovative multimodal learning approach. Strengths: the paper is clearly written; the results greatly improve performance and seem to achieve SOTA; the paper has adequate ablation experiments. Weaknesses: I noticed that the results in Table 2 used different settings (0-tensor vs. template) and the author just lists the best results; results of the same algorithm should come from the exact same network structure, otherwise they should be shown separately. For the same reason, the TNL2K result in Table 3 (483/466) differs from that in Table 2 (498/510). The article does not include a discussion of solving common tracking problems such as target deformation and occlusion; introducing a template update module may give better results on some samples.

This paper proposes an interesting solution that shows how to achieve state-of-the-art (SOTA) tracking performance without relying on the complex transformer architecture in visual tracking. Specifically, the authors explore unified multimodal learning of vision and language under simple ConvNets for tracking. The authors first give an in-depth analysis of the limitations of previous solutions and then introduce an innovative unified framework that learns a better vision-language (VL) representation by proposing the ModaMixer and an asymmetrical searching strategy (ASS). The former enhances the interaction inside vision-language for a discriminative representation, while the latter improves the adaptation of the VL representation learned by the ModaMixer. Extensive experiments on five challenging datasets demonstrate that the proposed method improves a pure CNN-based baseline to be competitive with and even better than recent transformer-based counterparts. Moreover, the proposed method is general and can also be applied to improve a transformer tracking architecture, as shown by experiments. In addition, some theoretical analysis is given to explain the effectiveness of the proposed method. Strengths: overall I am rather positive on this paper. In particular, I really like the motivation of this work, which aims at finding an alternative path instead of relying on the transformer architecture (although it is shown to be powerful for achieving SOTA tracking) by exploring the almost ignored multimodal learning, which I believe can inspire many other works on tracking and facilitate this field. The strengths of this work include: 1) Novelty: the paper introduces an innovative framework of multimodal vision-language learning for object tracking. The proposed ModaMixer enables good interaction between the two modalities throughout the whole network, which is completely different from other works that only let the two modalities interact at the result-fusion stage. Another novel point is the proposed ASS, which adapts the representations of different branches and modalities in an asymmetrical way. Compared with the conventional symmetrical design, ASS shows for the first time that an asymmetrical architecture may be more suitable for representation learning in tracking, which may provide new insights for future research. I like these novel ideas. 2) Good performance: the paper demonstrates excellent performance in improving simple, pure CNN-based tracking baselines. Specifically, on the challenging LaSOT, the proposed approach improves the baseline from 50.7 to 65.2 in SUC, an absolute gain of 14.5, which is competitive with and even better than recent transformer-based trackers. Compared to the previous best VL method with 54.0 SUC, this paper shows an 11.2 improvement. The performance on other benchmarks is also SOTA. This excellent performance clearly shows the effectiveness and advantages of the proposed method. 3) Generality: the proposed method is general and not limited to a CNN-based framework. The authors show this by applying their method to a transformer-based tracking framework and obtaining promising improvements on all five challenging benchmarks, consistent with the improvements shown for the CNN-based framework. 4) Rich experiments and analysis: the authors provide extensive experiments and analysis for the proposed method, which I appreciate. The experimental analysis with various ablation studies allows a better understanding of each module and of the overall performance, and the theoretical analysis explains why the proposed method works well in improving tracking performance. 5) Good writing and organization: the paper is well written and organized; each section has a clear motivation and it is easy to follow the ideas. I enjoyed reading the paper. Overall, I believe this paper is significant to the visual tracking community because it shows new insights and directions in designing a simple but effective tracking framework with SOTA performance. Weaknesses: although this paper is technically sound and novel, I have some minor concerns or questions. 1) In this work, language is crucial. I noticed that in the experiments the pseudo language description generated by an image caption model does not show significant improvements; what do you think are the reasons for this? 2) The proposed method is shown to be general, which is nice; however, it would be great if the authors could show more examples, such as conducting more experiments on additional pure CNN-based frameworks like SiamRPN (Li et al., CVPR 2019). 3) The authors show many comparisons with other SOTA methods, but I do not see a qualitative comparison with other approaches; it would be nice to see these results. 4) For the experiments with partial language (50 and 75) in the supplementary material, how do you determine which 50 or 75 should be used, by random generation? If generated randomly, it would be better to run multiple (e.g., 3) trials. I do not see a discussion of the limitations and broader impact; I hope the authors can provide a brief discussion of this part in the response, and they are encouraged to detail it in the revision.

This paper proposes a method to learn the vision-language representation for achieving SOTA tracking with pure ConvNets. It proposes a modality mixer as the fusion module and uses it in both low-level and high-level conv stages to explore the complementarity of the different modalities. It also uses NAS to search an asymmetrical Siamese network to learn adaptive vision-language representations, and the proposed method is also used to improve a transformer-based tracker. Strengths: the proposed method lets a CNN-based tracker and a transformer-based tracker achieve SOTA performance on five benchmarks, and the performance of vision-language tracking is largely improved; the work uses NAS to search different model structures for the template and the search region respectively, which is a novel practice for Siamese trackers. Weaknesses: the proposed trackers and the baselines use different training data, so the comparisons in this paper are unfair. For instance, both SiamCAR and TransT did not use TNL2K for model training, so they should not be used as baselines for comparison on TNL2K; SiamCAR only uses the LaSOT training set and achieves 50.7 SUC, whereas the proposed trackers use lots of extra training sets, so the performance gains shown in Figure 1 are debatable and the current ablation study cannot show the effectiveness of the proposed method. The modality mixer is too simple to be the main contribution: it is cross-modal channel attention. The authors attempted to make the structure more complex by designing a residual connection, but this modification raises doubts about the effectiveness of the cross-modal channel attention; we do not know whether the final improvements can be attributed to the modal fusion or to the pure vision feature, and the paper does not provide any ablation analysis about it, so the current experiments are not sufficient. The paper proposes a VL representation learning method for object tracking, but half of the training pairs do not have a language description, and the authors use the 0-tensor or the visual feature as a pseudo language description for training; it is doubtful that the learned representations are really multimodal. The authors conducted an experiment removing the description to prove the effectiveness of modal fusion, but how about using a 0-tensor or a visual pooling feature for inference? The current experiments are insufficient. (Limitations: n/a.)

### Summary:
All three reviewers lean towards acceptance of the paper. Reviewer yvur was not 100% excited about the paper, pointing out the simplicity of the approach and the lacking ablations. We encourage the authors to include the new materials they prepared for the rebuttal in the final version of the paper.
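The third review above objects that the modality mixer amounts to cross-modal channel attention with a residual connection. For context, a minimal sketch of that pattern is given below; it follows only the reviewer's description, so the module name, the pooling of the language embedding, and the tensor shapes are assumptions rather than the paper's actual design.

```python
import torch
import torch.nn as nn

class CrossModalChannelAttention(nn.Module):
    """Re-weights visual feature channels using a pooled language embedding.

    Hypothetical sketch based only on the reviewer's description (cross-modal
    channel attention plus a residual connection), not the paper's code.
    """

    def __init__(self, vis_channels: int, lang_dim: int):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(lang_dim, vis_channels),
            nn.Sigmoid(),  # per-channel weights in (0, 1)
        )

    def forward(self, vis_feat: torch.Tensor, lang_emb: torch.Tensor) -> torch.Tensor:
        # vis_feat: (B, C, H, W) visual features; lang_emb: (B, D) pooled language embedding.
        weights = self.gate(lang_emb)[:, :, None, None]  # (B, C, 1, 1)
        # Residual connection: the attended features are added back to the input,
        # which is exactly what the reviewer says makes it hard to attribute gains
        # to the language branch versus the pure vision feature.
        return vis_feat + vis_feat * weights

# Example usage with dummy tensors.
mixer = CrossModalChannelAttention(vis_channels=256, lang_dim=768)
out = mixer(torch.randn(2, 256, 32, 32), torch.randn(2, 768))  # -> (2, 256, 32, 32)
```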
[Token-ID list, all-ones attention-mask list, and label list (a duplicate of the token IDs) for the example above omitted.]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
This paper studies adversarial training of robust classification models. It is based on PGD training in Madry'17 and proposes two points: 1) add attention schemes; 2) add a feature regularization loss. The results on MNIST and CIFAR-10 demonstrate the effectiveness. Finally, it provides some diagnostic study and visualization of the attention maps and gradient maps. 1) Can you provide detailed explanations or intuitions for why attention will help train a more robust model? 2) Two related adversarial training papers are missing: ensemble adversarial training (ICLR 2018) and adversarial logit pairing (ICML 2018); also, feature/logit regularization has been studied in the ALP paper on ImageNet. 3) For Table 2 on CIFAR-10, I would like to see PGD with 20 iterations and step size 2 (in pixels), PGD-100 with step size 2, and PGD-200 with step size 2; I am also interested in seeing the CW loss, which is based on the logit margin. 4) I would like to see results using the wide model from the Madry'17 paper for ALP and LRM; I think results from large-capacity models are more convincing. 5) I would like to see results on CIFAR-100, which is a harder dataset (100 classes and 500 images per class). I think CIFAR-10 alone is not sufficient for justification nowadays (maybe it was enough one year ago); since ImageNet is to some extent computationally impossible for schools, I want to see the justification results on CIFAR-100. Post-rebuttal: I appreciate the additional results in the rebuttal and raise the score, but it is still slightly below acceptance. The reasons are: 1) incremental novelty; 2) insufficient experiments. Also, I found in Table 3 that the larger-capacity model is less robust than the smaller-capacity model against white-box iterative attacks, which is strange.

Summary: this paper argues that improved resistance to adversarial attacks can be achieved by an implicit denoising method in which model weights learned during adversarial training are encouraged to stay close to a set of reference weights using the L2 penalty. Additionally, the authors claim that by introducing an attention model, which focuses model training on more robust features, they can further improve performance. Some experiments are provided. Feedback: my main concerns with the paper are the following. The experimental section is fairly thin; there are at this point a large number of defense methods, of which Madry et al. is only one, and in light of these the experimental section should be expanded. The results should ideally be reported with error bars, which would help in gauging the significance of the results. The differential impact of the two contributions is not entirely clear: the results in Table 1 suggest that implicit denoising can help, yet at the same time Table 2 suggests that black-box performance is better if we just use the attention model. Overall this conflates the contributions unnecessarily and makes it hard to distinguish their individual impact. The section on gradient maps is not clear. The authors argue that if the gradient map aligns with the image, the model depends solely on the robust features; while this may be somewhat more intuitive in the context of simple GLMs, it is not clear why it should carry over to DNNs, and I think it would help to make these intuitions much more precise. Secondly, even if this were the case, the methodology of using a neural net to classify gradient maps and from this derive a robustness metric raises precisely the kinds of robustness questions that the paper tries to answer, i.e., how robust is the neural net classifying the gradient images, and how meaningful are its predictions when gradient maps deviate from clean images? Overall, I feel this paper has some potentially interesting ideas but needs additional work before it is ready for publication.

This paper proposes a new architecture for adversarial training that is able to improve both accuracy and robustness, using an attention-based model for feature prioritization and L2 regularization as implicit denoising. The paper is very clear and well written, and the contribution is relevant to ICLR. Pros: the background, model, and experiments are clearly explained; the paper provides fair comparisons with a strong baseline on standard datasets; using attention mechanisms to improve model robustness in an adversarial training setting is a strong and novel contribution; both quantitative and qualitative results are interesting.

### Summary:
The paper proposes an attention mechanism to focus on robust features in the context of adversarial attacks. Reviewers asked for more intuition, more results, and more experiments with different attack/defense models. The authors added experimental results and provided some intuition for their proposed approach. Overall, reviewers still think the novelty is too thin and recommend rejection; I concur with them.
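The second review above describes the defense as PGD adversarial training combined with an L2 penalty that keeps the current weights close to a set of reference weights (the implicit-denoising term). A minimal sketch of that objective follows; the attack hyperparameters, the penalty coefficient, and how the reference weights are chosen are assumptions for illustration only, not values from the paper.

```python
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8 / 255, step=2 / 255, iters=20):
    """Madry-style PGD in the L-infinity ball; the settings here are illustrative."""
    x_adv = x.detach() + torch.empty_like(x).uniform_(-eps, eps)
    x_adv = x_adv.clamp(0, 1)
    for _ in range(iters):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + step * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv.detach()

def robust_training_loss(model, ref_weights, x, y, lam=1e-3):
    """Adversarial loss plus an L2 penalty tying the weights to reference weights.

    `ref_weights` is a name-to-tensor dict (e.g., from a pretrained or earlier
    checkpoint); how the paper actually obtains them is not stated in the review.
    """
    x_adv = pgd_attack(model, x, y)
    adv_loss = F.cross_entropy(model(x_adv), y)
    reg = sum((p - ref_weights[n]).pow(2).sum()
              for n, p in model.named_parameters() if n in ref_weights)
    return adv_loss + lam * reg
```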
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper presents the idea that the current formulation of disentangled latent representations of data that have been presented are implausible in the sense that the factors are often not actually independent and cannot be learned or generated as independent instead the authors put forth the idea of transformations of data that are equivariant to the latent space representation as a formulation of disentangled factors the authors use group theoretical constructs such as shift and rotation operators to show that a latent space representation should be equivariant such transformations in other words if a latent space representation is rotated it should still reconstruct correctly because the reconstruction loss should be trained on a rotated version of the image the key strengths of this paper are the examples that showcase the lack of ability to learn independent latent factors figure 1 displays the failure to learn rotation as a factor in the mnist digit dataset figure 2 is even more convincing in that it shows that the orbits of the different factors cannot be mapped to one another and thus cannot be truly independent second i believe that the idea that is better stated in the introduction on how disentanglement can be framed is valuable in this framework the factors of variation are different subgroups acting on the dataset and the goal is to learn representations where separated of the data subspaces are equivariant to distinct subgroups theoretically the authors are proposing an operational view of the latent factors as separate transformations on the data and the representation as having subspaces equivariant to the transformations definition 1 is trying to state the same idea but is much less clear to the average ml reader more generally the authors should work harder to communicate this to the ml audience the group theoretical background from the appendix should be in the background section particularly the idea of equivariance and group operations the key weakness is that their new formulation of disentanglement is that it is definitional does not give a plan of how this should be done based on their description it seems as if the dataset has to come with a set of known operations on the data like rotations that are equivariant how would such operations be learned de novo from the data it seems as if the framework requires learning two things separately 1 a latent representation of the data 2 a set of equivariant operations on the data that are perhaps cyclic generators of an orbit it is not clear how this would be learned docsepsummary the authors proposed a new way to disentangle affine transformations without topological defects this paper made several theoretical contributions including a new definition of disentanglement and demonstration of the topological defects in existing disentanglement methods experimentally this paper showed how their proposed shift operator model is powerful when dealing with topological defects disentanglement is a relatively challenging task due to the lack of clear definition and the lack of a robust evaluation method the authors did a good job providing new theoretical definitions and providing empirical and qualitative results to support their claims the main weakness of the paper is the lack of quantitative metrics to evaluate their approach and compare with others in addition the model doesnt appear to be very flexible as it 
requires that the transformation is known in advance strengths overall the paper is well written and contains a good review of advances in the theory of disentanglement the idea of addressing topological defects for disentanglement appears novel using operators on the entire latent space is a new direction for the study of disentanglement the authors viewpoint that isolating factors of variation is different from mapping these factors into distinct subspaces and how they propose a new definition based on this viewpoint is interesting weaknesses lack of quantitative evaluation metrics the mse in the appendix is not enough for quantifying disentanglement since this paper focuses on disentanglement at least factorvae one of the other representative disentanglement vae models should be considered when doing the model evaluation baseline models should be optimized in a more comprehensive manner eg currently the selection of beta is 4 10 100 1000 and latent dimension is 10 30 its unclear whether these models have been well optimized or what measures are used to optimize the models for this task because the method requires that the transformation is known in advance this limits the flexibility of the approach how different transformations impact each other is not shown experimentally there is only an example on fig 3e showing some visual results but this should be elaborated on further given the goal of the paper minor points the complex version of the shift operator is used it would be interesting to show another version and their differences latent traversals results appear to be rather sparse it would be interesting to show how the variation exists inside the model via dense traversals and the computing of generated images variation with different latent traversals rotations may be more challenging to learn 2000 examples may be insufficient for the model to learn this transformation correctly docsepsummarize what the paper claims to contribute the authors claims to show that disentanglement into subspaces by a continuous encoder is impossible for any finite group acting on euclidean space the authors claim to introduce an alternative definition of disentanglement that is more flexible and leads to a strengths the authors consider the problem of disentangled representation learning which is of considerable interest to the community the authors approach the problem by imposing structure through their disentangled operators weaknesses the reliance of the impossibility of disentanglement proof seems to rely heavily on the example of the perturbed triangle the example and its assumptions seem fairly rigid and unnatural and i am unconvinced this captures the reality of disentangled representation learning with autoencoding networks the approach of adding structure by means of a transformation operator was also used in 12 which are cited but not compared against instead the authors compare against various vaes which do not impose any external structure which does not seem particularly appropriate if i understand correctly the paper seems to be based on a mischaracterization of the arguments in 3 clearly state your recommendation accept or reject with one or two key reasons for this choice reject see weaknesses supporting arguments for your recommendation while the authors tackle an interesting problem and propose an interesting solution the arguments on which the paper is based seem flawed ask questions you would like answered by the authors to help you clarify your understanding of the paper and 
provide the additional evidence you need to be confident in your assessment as i understand the argument is against the utility of the linear disentangled representation in 3 the more flexible definition the authors propose seems quite close to disentangled representation in 3 please clarify the difference moreover it seems the authors suggest the definition of disentangled representations proposed in 3 requires that subspaces corresponding to factors of variation are single dimensional section 2 which is not the case please clarify how does the approach compare against other methods namely 124 that use structure to encourage disentangling of the representation section 2 asserts that the vae and its variants do not learn disentangled representations and uses pca to show this is true i expect that if this same analysis were used in the structured case a similar result would be found in particular since the rotation matrix interacts with multiple dimensions of the latent code perhaps my intuition is incorrect please clarify provide additional feedback with the aim to improve the paper perhaps a rewording could clarify supervised disentanglement is composed of a 2x2 diagonal block is a block diagonal matrix with a 2x2 rotation matrix in the upper left block and 1s on the remaining diagonals just after 11 the authors state that most deep networks are differentiable my understanding is that the common relu networks are not differentiable but subdifferentiable possible typos vae betavae and ccivae the 4s the 4s dfn of a group identity element gk eg eg gk eg gk eg eg gk gk post rebuttal i thank the authors and other reviewers for their comments and discussion while the direction the authors pursue is of unquestionable merit i remain unconvinced that the work as it stands is sufficiently impactful for this venue 1 falorsi luca et al explorations in homeomorphic variational autoencoding arxiv preprint arxiv180704689 2018 2 connor marissa and christopher rozell representing closed transformation paths in encoded network latent space aaai 2020 3 higgins irina et al towards a definition of disentangled representations arxiv preprint arxiv181202230 2018 4 cohen taco and max welling learning the irreducible representations of commutative lie groups international conference on machine learning 2014docsepthe paper first shows that existing approaches to latent space disentanglement perform poorly when the latent space topology usually euclidean does not match the actual data topology using rotation equivariance as an example this analysis culminates in a general impossibility theorem for this type of disentanglement the authors then propose a relaxed definition of disentanglement and show that it can be realized by means of a shift operator in latent space theoretical and empirical results demonstrate the superiority of the new approach this is a very interesting idea that represents significant progress in an important problem unfortunately the current organization of the paper does not work well the authors devote too much space half of the paper to the explanation of the problem and too little barely one page to its solution this leaves the reader with many unanswered questions about how the new method works and what its crucial details are some of these questions are later dealt with in the appendix but this is too late my main suggestion for improvement is therefore to move most of section c 31 to the main text and allocate the required space by shortening the motivation up to section 32 and possibly 
the discussion of multiple transformations in section 43 content that would get lost by this change should be moved to the appendix more minor points are the authors repeatedly refer to recent success of distributed operators but do not cite and discuss any prior work please add appropriate references to the introduction or related work rotations and translations are continuous transformations whereas the proposed shift operator is discrete does this discretization introduce rounding errors or other artifacts how many discretization levels are needed and how can this number be determined does discretization have undesirable limitations such potential limitations should at least be acknowledged ideally these questions should be investigated experimentally but this can be left for future work if infeasible in the present paper in appendix e1 results for rotations are an order of magnitude better than those for translations why is this the case figure 3e it is hard to judge if the results align with the ground truth preferably the ground truth should be displayed for reference is it necessary to design the network according to apriory knowledge of the relevant group transformations or can this be inferred automatically for example what happens if the latent space implements a group that does not correspond to any symmetry in the data im willing to raise my rating if these points in particular the relocation of section c 31 are suitably addressed in an updated version of the submissiondocsepthis paper studies the notion of disentanglement in a group representation theoretic setting disentangling is sometimes conceptualized as mapping distinct factors eg position orientation to distinct subspaces it is shown theoretically that such a naive notion of disentangling is impossible for topological reasons and this is confirmed empirically an alternative definition of disentanglement is given where instead of confining the effect of each transformation to a subspace an operator is used that acts on the whole latent space this operator is chosen as a shift operator which works for cyclic groups it is shown empirically that an autoencoder with a shift operator in latent space is better able to learn rotations and translations the paper does a good job explaining why the naive notion of disentangling leads to topological problems and convincingly backs this up with experiments as well the insight is not new to me personally but i cant find a reference that explains it and i think it is not widely understood so i consider this an important contribution to the very muddled discourse on disentangling definition 1 provides a new definition of disentangling however the statement is not very precise and i am not convinced that it can reasonably be considered as a definition of disentanglement the definition is a representation is said to be disentangled with respect to a particular decomposition of a symmetry group into subgroups if there is a family of known operators acting on this representation potentially distributed across the full latent where each operator is equivariant to the action of a single subgroup based on the rest of the paper i think this means that we have for each subgroup gi an operator phiig acting on the latent space the definition does not make it clear that we wish the encoder to be equivariant wrt this operator and some operator acting on the input space but i will assume that is what is meant otherwise having an operator acting on the latent space is a rather vacuous requirement on 
the encoder / representation the definition does speak of the operator being equivariant which i will take to mean that it is a group representation ie phi(gg') = phi(g) phi(g') the operator being distributed i will take to mean that phi(g) can be any linear map not necessarily acting trivially on a subspace or being block diagonal / reduced the definition mentions that each subgroup should have its own operator but since all of them act on the whole subspace this seems to be a trivial constraint indeed if we have a representation of the whole group acting on the latent space simply restricting it to each subgroup gives us a representation of the subgroups i would further note that what is done in practice in the paper is different from this definition because we have one latent space per operator not multiple operators acting on the same space under this interpretation i dont see how the definition is saying anything else than that the network should be equivariant wrt some representation of the group acting on the input and output space although equivariance is a good property for various reasons it does not seem to me to be a reasonable definition of disentangling by itself indeed the identity map satisfies this constraint trivially it may be that i have misunderstood definition 1 but this strengthens the case for making it mathematically precise even if one can question whether def 1 is a good formalization of disentangling the paper does show empirically that it is easier to learn an equivariant encoderdecoder when the latent operator is a shift operator or a diagonalized complex version of it rather than a disentangled operator with one 2x2 rotation matrix block and an identity block fig 3b (see the illustrative sketch after the summary below) although i dont know if these two approaches have been compared before several older papers consider similar models to the shift operator model for instance in a sequence of papers memisevic hinton considered factorized rbms that do something similar cohen welling described a representationtheoretic version of this model which is very similar to what is presented in this paper at least the linear ae and also gave a definition of disentangling under this definition the complex diagonal shift operator is disentangled while the original shift operator is not models with a stack of multiple operators were considered by sohldickstein et al if one wishes to define a notion of disentangling based on subgroups and representations it may be worth investigating subgroup adapted gelfandtsetlin bases in summary i think this paper contains several interesting observations and results and i think the general direction is very interesting and deserves further study however im not convinced that this paper provides a good definition of disentangling the experiments although convincing and well executed are restricted to simplified domains and some of the insights / methods presented in the paper are already present in earlier work nevertheless i hope the authors will not be discouraged and continue to work on this important and fundamental problem using the tools of representation theory references memisevic hinton learning to represent spatial transformations with factored higherorder boltzmann machines 2010 sohldickstein wang olshausen an unsupervised algorithm for learning lie group transformations 2010 cohen welling learning the irreducible representations of commutative lie groups 2014 wakin donoho choi baraniuk the multiscale structure of nondifferentiable image manifolds 2005 postdiscussion update having read the other reviews author
response and updated paper i still think this paper is borderline the insight that disentangling transformations as naively defined is impossible for topological reasons is valid and interesting but seems to have been already observed by others eg falorsi et al nevertheless the paper does a good job explaining this so it could be useful as some authors seem to not know about this issue the definition of disentangling still seems a bit vague to me and im not convinced of practical applicability of the proposed method ### Summary:
this is a borderline case quite comparable to the other borderline case in my batch the paper has received careful reviews and based on my weighting of the different arguments i arrive at an average score between 5.75 and 6 the authors present some worthwhile ideas related to disentanglement that deserve more attention and that could spark more research in this direction at the same time the level of novelty and significance of this work remains a bit limited taken together the paper is likely not compelling enough to be among the top papers to be selected for publication at iclr
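The reviews above contrast a "disentangled" latent operator built from a 2x2 rotation block plus identity with a shift operator distributed over the whole latent vector, and require the representation (homomorphism) property phi(gg') = phi(g) phi(g'). The sketch below illustrates both operator types for a cyclic group; it is not code from either paper, and the function names and dimensions are assumptions made only for this example.

```python
# minimal sketch, assuming a cyclic group of order n acting on a latent vector z
import numpy as np

n = 8  # e.g. 8 discrete rotations

def rotation_block_op(k, dim=6):
    """'disentangled' operator: a 2x2 rotation block on the first two latent dims,
    identity on the rest -- the structure the reviews describe as topologically limited"""
    theta = 2 * np.pi * k / n
    op = np.eye(dim)
    op[:2, :2] = [[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]]
    return op

def shift_op(k, dim=n):
    """shift operator: a cyclic permutation distributed over the whole latent vector"""
    return np.roll(np.eye(dim), k, axis=0)

# both are group representations: op((k1 + k2) mod n) == op(k1) @ op(k2)
k1, k2 = 3, 7
for op, dim in [(rotation_block_op, 6), (shift_op, n)]:
    assert np.allclose(op((k1 + k2) % n, dim), op(k1, dim) @ op(k2, dim))

# acting with the shift operator simply rolls the latent coordinates, so a full
# orbit of length n is traced out and returns to the starting code
z = np.random.randn(n)
orbit = [shift_op(k) @ z for k in range(n)]
assert np.allclose(shift_op(0) @ z, z) and len(orbit) == n
```

The reviewer's point is that both operators satisfy this representation property, which is why equivariance alone is argued to be too weak as a definition; the empirical claim in the paper is only that the shift form, which spreads the group action across all latent dimensions, is easier to learn with an autoencoder than the block-diagonal form.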
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This work attempts to learn improved slot-based object representations and video segmentations with the help of the SLATE image GPT-2 decoder. The model is evaluated on several datasets, including ones generated by the authors. The results have dubious merit, the claims are poorly supported, and the writing needs a lot of improvement. There is little innovation in experimental methodology. On the plus side, the paper conducts a wide evaluation over several datasets, but the central claim that STEVE works on naturalistic images is not convincing at all. It is also unclear whether STEVE is any different than SAVi with a SLATE image GPT-2 decoder. There was no attempt to explain the limitations of the model.

Learning the compositional structure (e.g., a scene graph) in dynamic visual scenes without labels (e.g., object masks) has proven challenging. Slot-based models leveraging motion cues have recently made progress in learning to represent, segment, and track objects without direct supervision; however, these methods only work on synthetic data and fail on complex real-world multi-object videos. This paper proposes STEVE, an unsupervised model for object-centric learning in videos. The paper claims that it made significant progress by demonstrating its effectiveness on various complex and naturalistic videos, which is unprecedented. This is enabled by a simple architecture that uses a transformer-based image decoder conditioned on slots, and the learning objective is simply to reconstruct the observation. Experimental results on several videos show improvements compared to the previous state of the art (SAVi, SLATE, and OP3). Strengths: the paper proposes a simple architecture that uses a transformer-based image decoder conditioned on slots, and the learning objective is simply to reconstruct the observation; the transformer-based image decoder uses the one proposed in prior work, SLATE (Singh et al., 2022). Weaknesses: despite the claims of its effectiveness on complex and naturalistic videos, the datasets seem to be very limiting. Six of them are procedurally generated datasets: CATER (Girdhar & Ramanan, 2020), CATERTex, MOVi-Solid, MOVi-Tex, MOVi-D, and MOVi-E (Greff et al., 2022). Two natural datasets were collected only for this paper (Traffic and Aquarium), and the paper does not provide details on the complexity and diversity of these datasets. The paper also seems to lack novelty, as the design seems to be a straightforward combination of (1) a CNN-based image encoder, (2) a recurrent slot encoder that updates slots temporally with recurrent neural networks (RNNs), and (3) the slot-transformer decoder of SLATE. The paper does not address its limitations; a section clearly articulating the technical limitations is important.

The paper proposes a new method for unsupervised object discovery and tracking in videos, called STEVE. This model combines an encoding backbone, a recurrent slot encoder, and a transformer-based decoder to achieve impressive object discovery results on complex, textured, multi-object datasets. Additionally, the model provides useful insights into its predecessor SLATE, which worked on static images.
Strengths:
S1: The writing style is great and the paper is a pleasure to read. It is easy to follow and contains a lot of useful insights; in particular, I enjoyed the summarizing statements at the end of many paragraphs in the experiments section.
S2: The proposed method is comparatively simple yet very effective. It achieves impressive object discovery performance, especially on textured multi-object datasets.
S3: The paper provides extensive and insightful experiments across 8 datasets, 5 of which are introduced by the paper itself.
Weaknesses:
W1: The description of the model in L119-128 was somewhat unclear to me. Is the dVAE an entirely separate model from the slot-encoder/transformer-decoder architecture? How exactly do the dVAE and transformer decoder interact, and why? Additionally, since this is heavily based on the SLATE model, it would be nice to explicitly mention the architectural differences between the two models.
Others: The references need some cleanup; there are several papers that appear twice in the list.
Limitations are adequately addressed in the paper.

Post-rebuttal update: the rebuttal addresses my concerns with new experiments and a new limitations section in the manuscript. I continue to support the acceptance of this paper. Rating: 8. End of post-rebuttal update.
This paper proposes a model called STEVE for object-centric learning on video data. STEVE is a combination of a recurrent slot encoder (like SAVi) and the SLATE slot-transformer decoder. The combination works surprisingly well on relatively complex synthetic data and even on some real-world video datasets. STEVE demonstrates better object-centric decomposition when compared against baselines on the FG-ARI metric. It also generalizes OOD to an unseen number of objects and unseen materials.
Strengths: The proposed method is simple yet very effective. STEVE does not use extra annotations in the first frame, nor does it need optical flow or another signal in its loss function; purely using dVAE token reconstruction as the loss function (indirectly, just pixel reconstruction), STEVE is able to decompose textured videos, handle static objects (the MOVi-D dataset), and handle a moving camera (the MOVi-E dataset). The method has been tested on a wide variety of datasets, which gives more weight to the claims. The comparisons show that the SLATE decoder is significantly better than a spatial broadcast decoder in both per-frame and video metrics; the paper thus also provides insights about the SLATE decoder, which is itself very powerful for object-centric learning. The analysis covers obvious concerns, such as using encoder attention maps as opposed to decoder attention maps, and relevant ablations, such as dVAE granularity, are included.
Weaknesses (minor issues): There is a missing legend in Figure 3a; please note how many objects were used in the in-distribution and OOD settings. If possible, include STEVE in Figure 5a; what is the performance of STEVE relative to STEVE-diagnostic? I think you mean convolution transpose, not deconvolution, on page 9; please describe the exact architecture of this convolution-transpose network. The spatial broadcast decoder also does some convolution-transpose operations, so how is this decoder different? An explicit big-O analysis would be great in the computational-requirements paragraph. A model figure is missing; I have read the SAVi paper, so I understand this method, but people new to STEVE would likely benefit from a nice figure. FG-ARI ignores background pixels, so it does not penalize leaking of object segments into the background, and relying only on this metric is dangerous; is it possible to report readout-based metrics on these datasets? For instance, the slot attention paper reported property prediction for CLEVR.
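To make the FG-ARI concern above concrete, here is a minimal sketch of how foreground ARI is typically computed: the adjusted Rand index is evaluated only on pixels the ground truth marks as foreground, which is exactly why segment leakage into the background goes unpenalized. This is an illustrative implementation assuming label 0 denotes background, not the benchmark's exact evaluation code.

```python
# Hedged sketch: foreground-only ARI (FG-ARI). Assumes integer segmentation maps
# where ground-truth label 0 is background; not the paper's exact implementation.
import numpy as np
from sklearn.metrics import adjusted_rand_score

def fg_ari(true_seg: np.ndarray, pred_seg: np.ndarray) -> float:
    fg = true_seg.ravel() > 0                       # drop ground-truth background pixels
    return adjusted_rand_score(true_seg.ravel()[fg],
                               pred_seg.ravel()[fg])  # leakage into background is never scored
```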
It is still not clear why the transformer decoder, which mixes information from all the slots in the decoding step (unlike the mixture decoder, which is applied to each slot independently), still leads to object decompositions. Does one need to control transformer capacity to make this work? What is the most sensitive hyperparameter here?
Societal impact has been discussed. Limitations can be elaborated further: for instance, the memory complexity of SLATE has been discussed in the paper, but the time complexity of sampling an image has not. Since transformer decoders need autoregressive token-by-token generation, they tend to be very slow during inference (not a problem during training); maybe this becomes a big issue when running the model at high resolution (512x512 or 1080p).
### Summary:
The paper proposes a method for unsupervised learning of objects from videos. In particular, the proposed approach combines two existing ideas: a recurrent slot-based architecture (like SAVi) and an autoregressive image decoder (like SLATE). The method is thoroughly evaluated and shown to outperform the relevant baselines. After considering the authors' feedback and extensive discussions, the reviewers' opinions of the paper are still mixed: two strongly positive, one neutral leaning positive, and one strongly negative. The key strengths and weaknesses that were pointed out are as follows. Strengths: (1) a simple and efficient model; (2) convincing experimental results on 8 datasets, including 2 real-world ones; (3) insightful analysis and ablation studies. Weaknesses: (1) lack of novelty; (2) no experiments on more complex real-world datasets; (3) FG-ARI is not a perfect evaluation metric; (4) lack of insight from the experiments. Note that some of the mentioned weaknesses are in conflict with the strengths, for instance S1 with W1, S2 with W2, and S2-3 with W3-4. Taking all of the above into account, I believe the paper presents a simple yet efficient combination of two existing methods and evaluates it thoroughly, in line with (and actually somewhat above) what is commonly done for unsupervised object learning papers, leading to good performance and interesting insights. I believe the simplicity of the method is valuable and should not be critiqued as a lack of innovation, and the provided experiments are already on more complex datasets than those commonly used in the field. Therefore, at this point I recommend acceptance, but encourage the authors to take the reviewers' comments to heart whenever possible and adjust the paper accordingly.
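The reviews and the summary above all describe the same pipeline: a CNN backbone per frame, a recurrent slot encoder that carries slots across time (as in SAVi), and a transformer decoder conditioned on the slots that reconstructs discrete dVAE tokens. The sketch below is only meant to make that data flow concrete; the module choices (GRU-style slot update, layer sizes, teacher forcing without the shift-by-one) are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only: recurrent slot encoder + slot-conditioned transformer
# decoder trained to reconstruct dVAE tokens. Names, sizes, and the GRU update
# are assumptions, not the paper's exact code.
import torch
import torch.nn as nn

class RecurrentSlotVideoModel(nn.Module):
    def __init__(self, num_slots=11, slot_dim=128, vocab_size=4096, num_tokens=256):
        super().__init__()
        self.backbone = nn.Conv2d(3, slot_dim, kernel_size=5, stride=4, padding=2)  # stand-in CNN encoder
        self.slot_attn = nn.MultiheadAttention(slot_dim, num_heads=1, batch_first=True)
        self.slot_update = nn.GRUCell(slot_dim, slot_dim)
        self.init_slots = nn.Parameter(torch.randn(num_slots, slot_dim))
        dec_layer = nn.TransformerDecoderLayer(slot_dim, nhead=4, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers=4)
        self.token_emb = nn.Embedding(vocab_size, slot_dim)
        self.to_logits = nn.Linear(slot_dim, vocab_size)
        self.num_tokens = num_tokens

    def forward(self, video, dvae_tokens):
        # video: (B, T, 3, H, W); dvae_tokens: (B, T, num_tokens) codes from a frozen dVAE
        B, T = video.shape[:2]
        slots = self.init_slots.expand(B, -1, -1).contiguous()
        logits_per_frame = []
        mask = nn.Transformer.generate_square_subsequent_mask(self.num_tokens)
        for t in range(T):
            feats = self.backbone(video[:, t]).flatten(2).transpose(1, 2)   # (B, N, D)
            attended, _ = self.slot_attn(slots, feats, feats)                # slots attend to frame features
            slots = self.slot_update(attended.reshape(-1, slots.shape[-1]),
                                     slots.reshape(-1, slots.shape[-1])).view_as(slots)
            tgt = self.token_emb(dvae_tokens[:, t])                          # teacher forcing (no shift, for brevity)
            dec = self.decoder(tgt, memory=slots, tgt_mask=mask.to(tgt.device))
            logits_per_frame.append(self.to_logits(dec))
        logits = torch.stack(logits_per_frame, dim=1)                        # (B, T, num_tokens, vocab)
        # training objective: cross-entropy reconstruction of the dVAE token sequence
        return nn.functional.cross_entropy(logits.flatten(0, 2), dvae_tokens.flatten())
```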
input_ids: [token-ID array encoding the Input/Output text of this row; numeric contents omitted for readability]
attention_mask: [all-ones array of the same length as input_ids; omitted]
labels: [token-ID array identical to input_ids for this row; omitted]
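The three array columns above are the standard fields for causal-language-model fine-tuning: in this dump, labels simply repeats input_ids and attention_mask is all ones (no padding). A hedged sketch of how such rows are typically produced from the Input/Output pair follows; the actual tokenizer, maximum length, and any prompt masking used to build this dataset are not documented here, so every concrete choice in the snippet is an assumption.

```python
# Hypothetical reconstruction of how rows like the one above are usually built for
# causal-LM fine-tuning. Tokenizer choice, max length, and masking are assumptions.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")  # placeholder; the dataset's tokenizer is unknown

def build_row(input_text: str, output_text: str, max_len: int = 2048) -> dict:
    enc = tok(input_text + output_text, truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when no padding is applied
        "labels": list(enc["input_ids"]),         # labels duplicate input_ids for the LM loss
    }
```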
Below is a given review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This paper presents robust training against the union of lp threat models, i.e., training a robust model under the l-inf, l2, and l1 threat models. The key idea is to use a nuclear-norm-based training procedure called NCAT. Strengths: NCAT can achieve SOTA robust accuracy while requiring a single-step attack budget per minibatch, and NCAT can scale to large datasets. Weaknesses: the actual training performance in terms of computation time is not reported, so it is hard to compare the actual performance of NCAT against other robust training methods; prior work (Sriramanan et al., 2021) has already used nuclear norms in the context of adversarial training, albeit for single-step attacks. Yes.

The paper proposes an efficient robust training method to achieve adversarial robustness on multiple threat models (l1/l2/l-inf). Specifically, it first studies adversarial training under the l1 norm, where the normal adversarial training method can achieve good performance; by combining the previously proposed nuclear-norm adversarial training and a curriculum schedule, it finds it can achieve better robust accuracy against l1 adversarial attacks. As the l2 and l-inf threat models have similar behavior with respect to robustness, the paper then chooses to alternately switch the threat model when doing nuclear-norm adversarial training to improve efficiency. The experiments are conducted on CIFAR-10 and ImageNet-100 with ResNet-18 and WideResNet-28, and show that the proposed method can achieve better worst-case robustness while maintaining good scalability. Pros: (1) the paper is in general well written and easy to follow, except for the curriculum-scheduling part; (2) the experimental results show that the proposed method can achieve good worst-case robust accuracy. Cons: (1) the novelty of the proposed method is quite limited; it is basically a combination of nuclear-norm adversarial training with a curriculum schedule, and curriculum schedules are also discussed in previous works such as [1]; it is still unclear why the curriculum schedule is useful in adversarial training, or whether the proposed schedule only works for the l1 threat model. (2) The main claim of improving robustness over the union of threat models actually amounts to improving adversarial training's performance on the l1 threat model, which is not the same as improving performance over the union; in other words, if the union does not include the l1 threat model, will the proposed method still outperform the baseline? (3) Although the proposed method is better in terms of worst-case accuracy, it is sometimes worse on average; the main improvement, I think, is in training efficiency. Minor: (1) typos, e.g., line 219, "asigma1geq asimga1"; (2) the dashed line in Figure 1(b,c) is inconsistent with the legend. [1] Zhang, Jingfeng, et al. "Attacks which do not kill training make adversarial learning stronger." International Conference on Machine Learning, PMLR, 2020. The paper has discussed its limitation regarding provable certificates for different lp threat models and their union; however, I think one big limitation is that the paper requires the union to include the l1 threat model.

This work tackles the lucrative goal of multi-norm perturbation robustness, which is largely constrained by efficient robustness against l1-norm adversaries in the current literature. The authors propose a technique inspired by the nuclear-norm regularizer that helps achieve robustness against a union of adversaries efficiently. The curriculum-based method has clear performance gains, especially with complex datasets and large models. Strengths: the paper is well written, with easy-to-follow reasoning for design/algorithmic choices, build-up, and review of relevant literature. The evaluation is also thorough, although I would prefer to see at least some mention of the deviation in results across runs. The observation, and perhaps affirmation, that l1 and l-inf are the hardest threat models when aiming for union robustness is nice. Weaknesses: as such, I see no visible weaknesses in the paper apart from minor comments mentioned below and in the questions section; good job. Minor comments: Table 1, what does the … in NCAT stand for? Please clarify in the table description. Although societal impact is addressed in Section 7, I do not see much discussion around limitations.

The paper proposes a method, NCAT, to obtain models robust w.r.t. l1 or multiple lp norms efficiently, that is, with single-step adversarial training. NCAT combines nuclear-norm adversarial training (NuAT) with a curriculum for the size of the perturbations seen at training time, to prevent catastrophic overfitting in one-step adversarial training. Further, the paper applies NCAT to the multiple-norms scenario, achieving results close to those of more expensive multi-step methods. Strengths: the goal of reducing the cost of adversarial training without degrading the resulting robustness too much is well established, and to my knowledge this is the first work to extend single-step adversarial training to l1 and multiple norms. The proposed method is simple and effective in the experimental evaluation on both CIFAR-10 (with two architectures) and ImageNet-100, and several additional experiments are presented in the supplements to further analyze and support NCAT. Weaknesses: NCAT is built by combining several existing techniques (NuAT, a curriculum for epsilon, the steepest-ascent direction for the l1 threat model with the image-domain constraints, and alternating l1 and l-inf when training for multiple norms), which to some extent limits the contributions of the paper. Overall, I think that the novelty of the method is somewhat limited, but showing that it is possible to achieve robustness in challenging threat models with single-step adversarial training is a meaningful contribution. The limitations are sufficiently addressed.
### Summary:
This paper received mixed recommendations that range from borderline reject to strong accept. After examining all reviewers' comments, the author rebuttal, and the paper itself, I lean toward acceptance, mainly because (1) it seems that the author responses have addressed the reviewers' main concerns (although some reviewers did not respond to the rebuttal), and (2) the contributions made by the paper, as summarized in the paper and author rebuttal, are significant enough and could potentially facilitate the development of adversarial defense. Nevertheless, the organization and presentation of the paper need to be further improved; for instance, the novelty/contributions over some existing works (e.g., [1], [2]), the computational cost of the proposed method, and the possible limitations are insufficiently discussed in the main text of the original submission. I urge the authors to consider these issues and take careful note of the reviewers' comments and suggestions when preparing the final version. [1] F. Croce and M. Hein. Adversarial robustness against multiple lp-threat models at the price of one, and how to quickly fine-tune robust models to another threat model. [2] G. Sriramanan, S. Addepalli, A. Baburaj, and V. B. Radhakrishnan. Towards efficient and effective adversarial training. In NeurIPS 2021.
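The reviews agree on the structural ingredients of the method: a single attack step per minibatch, a curriculum on the perturbation budget, and alternating l1 / l-inf threat models when training for the union. The following sketch only illustrates that loop structure; the perturbation step shown is a generic steepest-ascent step (sign of the gradient for l-inf, budget concentrated on the largest-gradient coordinate for l1), not the paper's nuclear-norm attack, and all names, constants, and the linear epsilon schedule are assumptions.

```python
# Illustrative training-loop skeleton, not the authors' code: single-step attacks,
# a linear curriculum on epsilon, and alternating l1 / l-inf threat models.
import torch
import torch.nn.functional as F

def steepest_ascent_step(model, x, y, norm, eps):
    """Generic one-step perturbation under an lp budget (image clamping omitted)."""
    x = x.clone().requires_grad_(True)
    g = torch.autograd.grad(F.cross_entropy(model(x), y), x)[0]
    if norm == "linf":
        return eps * g.sign()
    flat = g.reshape(g.size(0), -1)                  # l1: spend the whole budget on the
    idx = flat.abs().argmax(dim=1)                   # largest-|gradient| coordinate
    delta = torch.zeros_like(flat)
    rows = torch.arange(flat.size(0))
    delta[rows, idx] = eps * flat[rows, idx].sign()
    return delta.view_as(g)

def train_union(model, loader, opt, epochs=50, eps_max={"l1": 12.0, "linf": 8 / 255}):
    norms = ["l1", "linf"]
    for epoch in range(epochs):
        scale = min(1.0, (epoch + 1) / (0.5 * epochs))   # assumed curriculum: ramp eps up
        for step, (x, y) in enumerate(loader):
            norm = norms[step % len(norms)]              # alternate threat models per batch
            delta = steepest_ascent_step(model, x, y, norm, scale * eps_max[norm]).detach()
            loss = F.cross_entropy(model(x + delta), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
```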
input_ids: [token-ID array encoding the Input/Output text of this row; numeric contents omitted for readability]
attention_mask: [all-ones array of the same length as input_ids; omitted]
labels: [token-ID array identical to input_ids for this row; omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper presents plasticinelab a differentiable physics environment geared towards softbody manipulation by implementing a softened rigiddeformable contact interface and by leveraging recent advances in softbody dynamics simulation difftaichi chainqueen plasticinelab is able to provide analytical gradients which seem to outperform gradientestimation based approaches sac td3 ppo on a few tasks strengths s1 the central problem tackled here is quite interesting and challenging there is a growing interest in the ml community wrt differentiable simulation techniques and in particular their applicability to learning dynamics s2 the paper is extremely wellwritten and easy to follow while this builds heavily on difftaichi and chainqueen it is commendable that this paper came across as selfcontained s3 i believe the characterization of related work is fair an interesting point made here was that tdw and sapien do not provide assets for softbody simulation i am unsure if thats entirely true but the lack of assets is indeed an issue for softbody simulation it will be a welcome contribution if this paper were to make these 3d assets publicly available s4 conceptually this paper claims that inductive biases arising from simulation of deformable objects should be exploited wherever possible in particular fig 4 and table 1 seem to indicate that gradientbased optimization using differentiable simulation consistently outperforms rl techniques in 810 tasks to ones anticipation gradientbased optimization seems to achieve significantly faster convergence seems like two orders of magnitude which is an impressive feat s5 it is interesting to see that it is possible to differentiate through contacts across rigid and deformable objects to my knowledge this has not been demonstrated before though it has been tangentially discussed in c and is a significant contribution weaknesses w1 the paper could benefit from an explicit exposition of critical design choices that affect differentiability while plasticinelab uses a particlebased model for representing and simulating softbodies alternatives in the form of usually tetrahedral meshbased representations exist it appears that particle systems are chosen to enable trivial differentiability eg the materialpoint method in the absence of contact forces is analytically differentiable w2 an important detail which i couldnt find in the paper andor supplementary material is how many of the parameters are simultaneously observable for example if the masses of the particles and the manipulator contact parameters are both unspecified wouldnt this lead to problems in observability ie both quantities cannot be simultaneously solved for resulting in illbehaved gradients w3 does the approach assume a oneone correspondence between the predicted and target shape while this might seem a reasonable assumption i believe gradient computation is cumbersome and perhaps ambiguous were this to be relaxed w4 another crucial detail that the paper does not seem to get through as with most other differentiable physics approaches unmodelled effects in the dynamical system might limit the applicability of the system since the physics engine only implements forces and softbody dynamics that are predetermined i would imagine it is hard to emulate realworld effects such as wearandtear sophisticated contacts and material properties in favor of the paper though i feel this detail might also be out of
scope to an extent recent approaches such as neural dynamical systems a and learning physical constraints by neural projections b come to mind to handle some of these concerns summary while the differentiable simulation aspect of the paper is not substantially novel building atop difftaichi chainqueen and soft contact models stomakhin et al 2013 the overall system is impressive and addresses a gap in the differentiable physics community simulating differentiable softbody dynamics as well as interaction with a limited class of rigid bodies could open up interesting avenues in reinforcement learning and softbody manipulation a neural dynamical systems balancing structure and flexibility in physical prediction arxiv 2020 b learning physical constraints with neural projections arxiv 2020 c scalable differentiable physics for learning and control icml 2020 docsep summary in this work the authors present plasticinelab a new framework for softbody manipulation tasks for reinforcement learning and planning algorithms the environment consists of a novel soft ie deformable plastic material termed plasticine which is complex to model and manipulate because of the inherently complex highdimensional governing equations and the large number of degrees of freedom associated with soft materials the plasticinelab framework proposes 10 novel tasks involving manipulation of the soft plasticine material the authors show thorough empirical analysis that traditional state of the art modelfree reinforcement learning algorithms fail to effectively learn the task even after a substantial amount of training thus effectively showcasing the complexity of the proposed tasks and the inability of state of the art rl models to model the proposed tasks positives 1 novel tasks each task poses a different challenge eg some tasks involve flattening the plasticine other involve pinching the plasticine while yet other tasks involve grasping one or multiple plasticine objects at one or multiple points and deforming it or moving it in some required manner 2 the variety of tasks test various facets of rl like longterm planning especially in the case of multi stage tasks 3 another major effort prelavent in the paper is that the authors have chosen to use a differentiable physics engine using the difftaichi system thereby making the gradients available for planning and control algorithms 4 the paper highlights through empirical results the superiority of gradientbased approaches over modelfree rl approaches that leverage the underlying differentiable physics engine toward learning the required tasks 5 an important facet of a benchmark is to propose tasks that are sufficiently complex for the current state of the art procedures the authors employ 3 state of the art modelfree rl algorithms and show that these rl models perform poorly in a majority of the 10 tasks torus rollingpin move tasks are the only three tasks where the modelfree procedures are able to perform somewhat comparably with the gradientbased planning approaches which themselves perform well in all but the writer pinch and triplemove tasks the last task involves multiobject manipulation and requries longterm planning and hence concerns 1 in figure 4 the adam optimizer and the gd which both seem to consistently accumulate the greatest rewards also have high variance some commentary about how this can be explained would help the reader better contextualize the results 2 as one of the main claims of the paper is the challenge of softbody manipulation and the proposal of a 
framework for the same it is imperative to demonstrate the variation of the degree of difficulty with increase or decrease in rigidity of the materials being manipulated a comparative analysis such as this demonstrating for example the variation in iou error of the best performing rl model with increasing in yield stress for plasticine would serve to showcase the actual challenge posed by softbody material mainpulation in the context of the current proposed framework ofcourse since decreasing softness and increasing rigidity is most likely not as simple as increasing a single number such as yield stress this is a minor concern and more a suggestion toward a holistic analysis of the proposed framework minor details suggestions 1 extrapolation is an important facet of learning algorithms in general since one of the suggestions of the current work is to present the plasticinelab framework as a way to not only characterize rl and gradientbased algorithms but also combine these two families of methods it is also important to evaluate the performance of these models on unseen but related tasks eg manipulating a table with fewer or grater number of legs trying to place more than 3 objects at specified locations 2 the citation relating to the paper by avila et al titled endtoend differentiable physics for learning and control published in the advances in neural information processing systems conference in 2018 seems to be repeated 3 another potential direction of the current framework could be using plasticinelab to learn policies which might be transferred to the realworld similar to the task mentioned in 1 if feasible adding some brief commentary about this in the context of 1 might open up further avenues of exploration for plasticinelab references 1 matas j james s davison aj simtoreal reinforcement learning for deformable object manipulation arxiv preprint arxiv180607851 2018 jun 20docsepthe paper introduces a new opensource simulation benchmark for soft robotics the simulation environment builds on top of difftaichi an existing differentiable simulator which enables endtoend differentiability the paper proposes 10 different tasks each with 5 variations and evaluates both rlbased policy learning methods and gradientbased optimization methods on those tasks the results suggests neither current rlbased methods nor gradientbased method can solve most of the tasks efficiently especially for those require longterm planning overall the paper is wellwritten and the contribution is wellargued i have a few comments questions as follows the simulator only considers the state of the endeffector of the manipulator it would be great to consider higher dof soft manipulators eg 12 which would further benefit the soft robotics community given the randomness and the nature of the rl algorithms 3 the evaluation in section 52 should be done with at least multiple random seeds with multiple trials per seed to make the benchmarking results statistically significant similarly for section3 in table 1 it would be great to see the standard deviation in addition to the average value its also not clear how many trials were conducted in order to get the numbers shown in both table 1 and the plots in fig 4 1 della santina cosimo et al dynamic control of soft robots interacting with the environment 2018 ieee international conference on soft robotics robosoft ieee 2018 2 george thuruthel thomas et al control strategies for soft robotic manipulators a survey soft robotics 52 2018 149163 3 khimya khetarpal zafarali ahmed 
andre cianflone riashat islam joelle pineau reproducibility in machine learning workshop icml 2018docsepplasticinelab the paper presents a new softbody manipulation benchmark for rl and differentiable planning the presented simulation suite is very interesting and the contribution is solid strength new simulation benchmark with features that are not yet well explored differentiable physics to open up possibilities for planning methods tasks are difficult enough to be challenging for a while baseline results are provided weaknesses only the computation times would be good to add presentation the paper is clearly written and easy to follow ways to improve the paper wallclock times would really be very useful both for the forward pass as well as a backward pass through the entire horizon with adam maybe also some notes on how it can be parallelized since you have a cuda implementation details p5 last paragraph for any grid points with a signed distance d the formulation is not clear enough do you mean with positive signed distance prob not because you can also have penetration but why would a point then not have a distance to the rigid body same paragraph by definition s decays exponentially with d until d becomes negative when penetration occurs well it decays with increasing distance and then it cannot become negative if it increases 51 iou definition are s always positive i dont exactly understand what the mass tensor s is i know it from rigid body dynamics but this does not seem to be the same here can you clarify this better such that it becomes clear why the formula describes and iou fig 4 consider removing the grey background that seaborn uses automatically the plots will look much cleaner and better visible ### Summary:
this paper proposes a new differentiable physics benchmark for softbody manipulation the proposed benchmark is based on the difftaichi system several existing reinforcement learning algorithms are evaluated on this benchmark the paper identify a set of key challenges that are posed by this specific benchmark to rl algorithms short horizon tasks are shown to be feasible by optimizing the physics parameters via gradient descent the reviewers agree that this paper is very wellwritten the problem tackled in it is quite interesting and challenging and the use of differentiable physics in rl for manipulating soft objects quite intriguing
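For concreteness, the gradient-based planning baseline discussed in the reviews above treats the open-loop action sequence as a free parameter and backpropagates an analytic loss gradient through the unrolled simulation. The snippet below is a minimal PyTorch sketch of that optimization pattern using a toy differentiable dynamics stand-in; it is not PlasticineLab's actual API, and the rollout, loss, and all names here are illustrative assumptions only.

```python
import torch

# Toy stand-in for a differentiable simulator: a 2-d "blob" of particles is
# nudged by per-step manipulator velocities. This is NOT PlasticineLab's API,
# only an illustration of planning by backpropagating through a rollout.
def rollout(particles, actions):
    for a in actions:                        # unrolled and fully differentiable
        push = torch.tanh(a)                 # bounded manipulator velocity
        particles = particles + 0.1 * push   # every particle gets dragged a little
    return particles

torch.manual_seed(0)
init = torch.randn(64, 2) * 0.05             # 64 particles near the origin
target = torch.tensor([1.0, -0.5])           # desired blob centroid

horizon = 20
actions = torch.zeros(horizon, 2, requires_grad=True)
opt = torch.optim.Adam([actions], lr=0.05)

for step in range(300):                      # a few hundred optimizer steps,
    opt.zero_grad()                          # vs roughly 1e5+ env steps for model-free RL
    final = rollout(init, actions)
    loss = (final.mean(dim=0) - target).pow(2).sum()
    loss.backward()                          # analytic gradient through the whole rollout
    opt.step()

print(round(loss.item(), 4))                 # approaches 0 as the plan converges
```

The only point of the sketch is the optimization pattern the reviews contrast with model-free RL: a few hundred Adam steps over the action sequence instead of orders of magnitude more environment interactions.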
[ 22403, 494, 5113, 281, 619, 3640, 436, 556, 417, 644, 5183, 1078, 556, 644, 12717, 4303, 5469, 275, 260, 285, 310, 247, 1534, 7680, 50274, 20881, 1255, 265, 50276, 88, 18, 253, 2929, 812, 5649, 432, 271, 6843, 47284, 273, 4619, 2216, 10165, 326, 2818, 1027, 74, 1430, 1223, 8013, 29943, 357, 4648, 247, 8091, 3169, 1566, 323, 9999, 285, 948, 8287, 2602, 67, 4550, 18075, 275, 253, 830, 273, 3798, 26823, 16232, 17489, 3169, 14237, 2226, 352, 4620, 326, 8091, 2718, 403, 6777, 281, 8046, 14916, 1027, 74, 1430, 24088, 253, 2144, 3659, 1332, 275, 253, 2117, 1215, 273, 3057, 5621, 310, 41398, 46350, 50276, 88, 19, 271, 1774, 2508, 534, 891, 812, 2649, 1089, 275, 253, 2929, 285, 263, 24864, 2144, 310, 849, 1142, 273, 253, 3602, 403, 10486, 24802, 323, 1650, 604, 253, 11843, 273, 253, 6353, 285, 253, 9452, 11699, 3057, 3602, 403, 1097, 45346, 651, 2649, 436, 1421, 281, 3237, 275, 1759, 1430, 26332, 1097, 13483, 2550, 320, 10486, 14042, 323, 4795, 275, 2853, 48384, 9367, 27935, 50276, 88, 20, 1057, 253, 2746, 5467, 247, 581, 531, 17668, 875, 253, 8131, 285, 2303, 5281, 1223, 436, 1537, 1646, 247, 5272, 9376, 891, 2868, 11786, 13782, 310, 41049, 285, 4931, 23851, 497, 436, 281, 320, 19595, 50276, 88, 21, 1529, 9560, 2508, 326, 253, 2929, 1057, 417, 1646, 281, 755, 949, 347, 342, 954, 643, 46350, 12057, 7274, 440, 2307, 5911, 2538, 275, 253, 18525, 985, 1537, 2701, 253, 30437, 273, 253, 985, 1580, 253, 12057, 3948, 760, 17930, 5621, 285, 2602, 2915, 8062, 326, 403, 17095, 891, 651, 8564, 352, 310, 1892, 281, 802, 4187, 1524, 10186, 2538, 824, 347, 8251, 395, 442, 274, 18144, 12716, 285, 2144, 3607, 275, 3718, 273, 253, 2929, 2167, 891, 1928, 436, 2508, 1537, 671, 320, 562, 273, 7990, 281, 271, 6070, 3332, 7274, 824, 347, 11454, 18525, 2718, 247, 285, 4715, 3520, 10806, 407, 11454, 20553, 270, 1705, 281, 2564, 281, 6016, 690, 273, 841, 7350, 50274, 8774, 50276, 6050, 253, 46350, 9864, 4809, 273, 253, 2929, 310, 417, 9619, 4460, 3652, 31831, 722, 649, 66, 22208, 5931, 1452, 257, 285, 2602, 3057, 3210, 50276, 296, 297, 18980, 249, 1162, 355, 4072, 253, 4583, 985, 310, 13943, 285, 12453, 247, 8037, 275, 253, 46350, 12057, 3114, 948, 8287, 46350, 2602, 2915, 8062, 347, 973, 347, 5016, 342, 247, 3710, 966, 273, 16572, 8248, 812, 1527, 598, 4722, 44201, 275, 35221, 4715, 285, 2602, 2915, 19763, 50276, 66, 11454, 18525, 2718, 26259, 2605, 285, 15840, 275, 3520, 10554, 549, 32693, 9169, 50276, 67, 4715, 3520, 10806, 342, 11454, 20553, 549, 32693, 9169, 50276, 68, 44755, 46350, 12057, 323, 4715, 285, 1453, 17857, 1686, 9169, 5474, 33032, 50276, 8774, 50276, 249, 436, 789, 253, 4477, 1246, 8013, 29943, 357, 247, 747, 7792, 323, 2602, 2915, 19763, 8892, 323, 35221, 4715, 285, 7219, 11333, 253, 3126, 8414, 273, 247, 4460, 2602, 26332, 22403, 494, 8013, 2144, 23776, 8013, 460, 534, 310, 2570, 281, 1566, 285, 26526, 984, 273, 253, 26557, 2570, 1029, 6967, 13200, 7424, 285, 253, 1781, 1180, 273, 7759, 273, 7185, 2330, 342, 2602, 4753, 253, 8013, 29943, 357, 7792, 29328, 884, 4460, 8892, 7668, 19763, 273, 253, 2602, 8013, 460, 2144, 253, 4477, 921, 11080, 16774, 1783, 326, 5899, 1375, 273, 253, 1445, 771, 813, 658, 35221, 4715, 11333, 1891, 281, 8069, 3037, 253, 4836, 1014, 846, 247, 6832, 2408, 273, 3733, 3021, 8069, 44762, 2355, 253, 10454, 273, 253, 4081, 8892, 285, 253, 19297, 273, 1375, 273, 253, 1445, 391, 77, 3210, 281, 1566, 253, 4081, 8892, 50275, 993, 23223, 50276, 18, 4460, 8892, 1016, 4836, 24543, 247, 1027, 5691, 24088, 690, 8892, 6388, 30627, 2980, 253, 8013, 460, 643, 6388, 9176, 7695, 253, 8013, 460, 
1223, 2568, 643, 8892, 6388, 48635, 581, 390, 2709, 8013, 460, 5113, 387, 581, 390, 2709, 2792, 285, 372, 14692, 352, 390, 4886, 352, 275, 690, 2424, 5133, 50275, 19, 253, 5235, 273, 8892, 1071, 2710, 42083, 273, 391, 77, 751, 1048, 3945, 7219, 3340, 275, 253, 1083, 273, 4471, 3924, 8892, 50276, 20, 1529, 2201, 3434, 638, 19306, 290, 275, 253, 2929, 310, 326, 253, 4477, 452, 6777, 281, 897, 247, 46350, 12057, 3948, 970, 253, 722, 649, 66, 22208, 985, 7624, 2403, 253, 27935, 2130, 323, 7219, 285, 1453, 11333, 50275, 21, 253, 2929, 16681, 949, 16774, 1543, 253, 34385, 273, 11786, 3169, 7274, 689, 771, 813, 658, 391, 77, 7274, 326, 25057, 253, 6944, 46350, 12057, 3948, 2584, 4715, 253, 2424, 8892, 50276, 22, 271, 1774, 32124, 273, 247, 22791, 310, 281, 12661, 8892, 326, 403, 10481, 2570, 323, 253, 1655, 1375, 273, 253, 1445, 7259, 253, 4477, 2126, 495, 1375, 273, 253, 1445, 771, 813, 658, 391, 77, 11333, 285, 921, 326, 841, 391, 77, 3210, 1347, 15225, 275, 247, 5020, 273, 253, 884, 8892, 32621, 14572, 9852, 2118, 8892, 403, 253, 760, 1264, 8892, 835, 253, 771, 813, 658, 7259, 403, 2104, 281, 1347, 8489, 3294, 1598, 342, 253, 11786, 3169, 7219, 7274, 534, 3746, 1347, 973, 275, 512, 533, 253, 8406, 33161, 285, 16260, 15106, 8892, 253, 1390, 4836, 8687, 4471, 6082, 19763, 285, 23169, 6465, 1048, 3945, 7219, 285, 7613, 50273, 585, 1209, 2224, 50276, 18, 275, 4677, 577, 253, 38622, 5556, 6081, 285, 253, 305, 69, 534, 1097, 1646, 281, 12724, 29010, 253, 6459, 23267, 671, 452, 1029, 11041, 690, 22378, 670, 849, 436, 476, 320, 5544, 651, 1361, 253, 9414, 1805, 33876, 907, 253, 1543, 50275, 19, 347, 581, 273, 253, 2022, 3916, 273, 253, 2929, 310, 253, 5691, 273, 2602, 2915, 19763, 285, 253, 10419, 273, 247, 7792, 323, 253, 1072, 352, 310, 30048, 281, 7568, 253, 7629, 273, 253, 4248, 273, 10183, 342, 2572, 390, 6379, 275, 40350, 273, 253, 4753, 1146, 32494, 247, 20407, 1783, 824, 347, 436, 17227, 323, 1650, 253, 7629, 275, 891, 276, 2228, 273, 253, 1682, 9591, 391, 77, 1566, 342, 3629, 275, 4917, 4073, 323, 8013, 460, 651, 5752, 281, 34647, 253, 4588, 5691, 22691, 407, 2602, 2915, 2144, 2022, 81, 1427, 275, 253, 3634, 273, 253, 1655, 4081, 7792, 273, 17406, 1580, 11052, 2602, 1255, 285, 3629, 40350, 310, 954, 2779, 417, 347, 2969, 347, 3629, 247, 2014, 1180, 824, 347, 4917, 4073, 436, 310, 247, 5884, 4468, 285, 625, 247, 14876, 2584, 247, 45290, 1783, 273, 253, 4081, 7792, 50274, 37585, 4278, 50276, 35640, 621, 50276, 18, 26480, 17888, 310, 271, 1774, 32124, 273, 4715, 11333, 275, 2087, 1580, 581, 273, 253, 13991, 273, 253, 1655, 789, 310, 281, 1246, 253, 8013, 29943, 357, 7792, 347, 247, 1039, 281, 417, 760, 17710, 391, 77, 285, 11786, 3169, 11333, 533, 671, 13398, 841, 767, 5870, 273, 3082, 352, 310, 671, 1774, 281, 7472, 253, 3045, 273, 841, 3210, 327, 39709, 533, 2905, 8892, 24088, 40238, 247, 2829, 342, 11184, 390, 650, 727, 1180, 273, 9246, 2820, 281, 1659, 625, 685, 495, 5113, 387, 7616, 8593, 50276, 19, 253, 25577, 12600, 281, 253, 2929, 407, 1323, 8807, 1162, 355, 18879, 990, 936, 423, 46350, 12057, 323, 4715, 285, 1453, 3863, 275, 253, 16424, 275, 11454, 1491, 5162, 2718, 8059, 275, 4765, 3133, 281, 320, 6015, 50275, 20, 1529, 2442, 3884, 273, 253, 1655, 7792, 812, 320, 970, 8013, 29943, 357, 281, 3037, 7823, 534, 1537, 320, 9495, 281, 253, 1524, 10186, 2074, 281, 253, 4836, 5393, 275, 337, 604, 17887, 6240, 690, 4864, 22378, 670, 436, 275, 253, 3634, 273, 337, 1537, 1527, 598, 2007, 44201, 273, 17947, 323, 8013, 29943, 357, 50275, 250, 3065, 50276, 18, 1111, 284, 480, 480, 1443, 256, 34843, 
1988, 29168, 948, 85, 43549, 35221, 4715, 323, 22403, 494, 1789, 19763, 549, 32693, 638, 3845, 549, 32693, 11395, 25616, 37256, 4765, 12468, 1384, 7152, 339, 431, 248, 2929, 23970, 247, 747, 13279, 1505, 9864, 22791, 323, 2602, 15688, 982, 253, 9864, 3126, 21168, 327, 1755, 273, 722, 649, 66, 22208, 271, 5368, 46350, 40022, 534, 13276, 990, 936, 423, 1027, 74, 1430, 253, 2929, 29328, 884, 1027, 8892, 1016, 342, 608, 10575, 285, 44995, 1097, 391, 77, 3169, 3646, 4715, 3082, 285, 11786, 3169, 13757, 3082, 327, 1110, 8892, 253, 1543, 5936, 6747, 1655, 391, 77, 3169, 3082, 4543, 11786, 3169, 1332, 476, 8415, 954, 273, 253, 8892, 14556, 3340, 323, 1110, 2430, 1048, 3945, 7219, 50276, 1189, 455, 253, 2929, 310, 973, 15720, 285, 253, 7680, 310, 973, 1662, 2107, 891, 452, 247, 1643, 5701, 50276, 34974, 347, 3637, 50276, 783, 40022, 760, 19401, 253, 1375, 273, 253, 990, 8222, 263, 273, 253, 9452, 11699, 352, 651, 320, 1270, 281, 1908, 2169, 513, 71, 2602, 9452, 28457, 24088, 1249, 534, 651, 2007, 5649, 253, 2602, 15688, 982, 3114, 50276, 28821, 253, 3632, 1255, 285, 253, 3753, 273, 253, 391, 77, 11333, 495, 253, 7103, 275, 2593, 8073, 943, 320, 2218, 342, 387, 1878, 2709, 3632, 12922, 342, 2709, 7587, 591, 8357, 281, 1056, 253, 22791, 272, 1543, 50276, 8766, 18260, 1534, 50276, 3549, 6241, 323, 2593, 20, 275, 2829, 337, 352, 651, 320, 1270, 281, 923, 253, 2629, 11254, 275, 1635, 281, 253, 3388, 1318, 697, 671, 417, 2590, 849, 1142, 7587, 497, 5196, 275, 1340, 281, 755, 253, 3904, 2011, 275, 1097, 2829, 337, 285, 253, 14777, 275, 3036, 577, 50276, 18, 14804, 256, 386, 1758, 7349, 17622, 1162, 355, 7870, 1453, 273, 2602, 25497, 18745, 342, 253, 3126, 4765, 26332, 1796, 5213, 8059, 327, 2602, 15688, 982, 4848, 375, 23037, 26332, 1796, 4765, 50276, 19, 3471, 4652, 289, 321, 307, 2955, 289, 4921, 1162, 355, 1453, 8130, 323, 2602, 35121, 9452, 28457, 247, 6630, 2602, 15688, 982, 8073, 4765, 21300, 18121, 50276, 20, 465, 13243, 5973, 465, 6168, 5916, 267, 1182, 2320, 274, 8952, 21799, 1314, 285, 250, 260, 757, 1258, 531, 4172, 284, 700, 310, 5247, 3371, 4415, 21175, 1952, 38041, 275, 5145, 4715, 22586, 17857, 1686, 4765, 7152, 339, 377, 77, 3258, 29943, 357, 50276, 783, 2929, 10262, 247, 747, 2602, 2915, 19763, 22791, 323, 391, 77, 285, 46350, 7219, 253, 3559, 9864, 18880, 310, 1077, 4722, 285, 253, 7680, 310, 4891, 50276, 45563, 50276, 1826, 9864, 22791, 342, 3386, 326, 403, 417, 2568, 973, 14859, 50276, 19623, 6051, 12057, 281, 1527, 598, 15018, 323, 7219, 3082, 50276, 40480, 403, 2834, 2217, 281, 320, 11132, 323, 247, 1223, 50276, 44650, 1543, 403, 2530, 50276, 20881, 1255, 265, 50276, 7483, 253, 13782, 2069, 651, 320, 1175, 281, 823, 50276, 49836, 253, 2929, 310, 4518, 3542, 285, 3477, 281, 956, 50276, 1576, 281, 3157, 253, 2929, 50276, 12081, 13273, 2069, 651, 1663, 320, 1077, 4217, 1097, 323, 253, 3579, 1509, 347, 973, 347, 247, 19265, 1509, 949, 253, 2862, 16892, 342, 38622, 5046, 671, 690, 7211, 327, 849, 352, 476, 320, 7529, 1025, 1580, 368, 452, 247, 260, 14776, 7092, 50275, 23454, 50276, 81, 22, 1390, 12494, 323, 667, 9860, 2792, 342, 247, 6704, 4181, 277, 253, 15895, 310, 417, 2590, 2217, 50275, 3088, 368, 1599, 342, 2762, 6704, 4181, 1742, 417, 984, 368, 476, 671, 452, 24280, 533, 2139, 651, 247, 1127, 840, 417, 452, 247, 4181, 281, 253, 16572, 2133, 50274, 18941, 12494, 407, 5426, 256, 27221, 28596, 342, 277, 1919, 277, 4916, 4016, 672, 24280, 6634, 973, 352, 27221, 342, 3629, 4181, 285, 840, 352, 2550, 2489, 4016, 604, 352, 5459, 50276, 3712, 891, 276, 5426, 403, 256, 1900, 2762, 891, 
13414, 4555, 2096, 752, 253, 2280, 13148, 256, 310, 891, 871, 352, 432, 16572, 2133, 8062, 533, 436, 1057, 417, 1646, 281, 320, 253, 1072, 1060, 476, 368, 19148, 436, 1805, 824, 326, 352, 4916, 2590, 2139, 253, 7212, 8631, 285, 891, 276, 50276, 926, 577, 1908, 11922, 253, 14370, 4114, 326, 396, 357, 1575, 4648, 8356, 253, 14777, 588, 1007, 1199, 28452, 285, 1805, 7985, 50274, 187, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 747, 46350, 12057, 22791, 323, 2602, 2915, 19763, 253, 4081, 22791, 310, 1754, 327, 253, 50276, 69, 338, 649, 66, 22208, 985, 2067, 5368, 35221, 4715, 11333, 403, 6760, 327, 436, 22791, 253, 2929, 4271, 247, 873, 273, 2234, 7881, 326, 403, 22691, 407, 436, 2173, 22791, 281, 391, 77, 11333, 2159, 16892, 8892, 403, 2011, 281, 320, 17887, 407, 39793, 253, 12057, 3602, 3066, 11786, 18499, 253, 30628, 5194, 326, 436, 2929, 310, 1077, 973, 15720, 253, 50276, 28872, 11463, 1070, 275, 352, 310, 3240, 4722, 285, 11132, 285, 253, 897, 273, 46350, 12057, 275, 391, 77, 323, 40238, 2602, 5113, 3240, 27807 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 22403, 494, 5113, 281, 619, 3640, 436, 556, 417, 644, 5183, 1078, 556, 644, 12717, 4303, 5469, 275, 260, 285, 310, 247, 1534, 7680, 50274, 20881, 1255, 265, 50276, 88, 18, 253, 2929, 812, 5649, 432, 271, 6843, 47284, 273, 4619, 2216, 10165, 326, 2818, 1027, 74, 1430, 1223, 8013, 29943, 357, 4648, 247, 8091, 3169, 1566, 323, 9999, 285, 948, 8287, 2602, 67, 4550, 18075, 275, 253, 830, 273, 3798, 26823, 16232, 17489, 3169, 14237, 2226, 352, 4620, 326, 8091, 2718, 403, 6777, 281, 8046, 14916, 1027, 74, 1430, 24088, 253, 2144, 3659, 1332, 275, 253, 2117, 1215, 273, 3057, 5621, 310, 41398, 46350, 50276, 88, 19, 271, 1774, 2508, 534, 891, 812, 2649, 1089, 275, 253, 2929, 285, 263, 24864, 2144, 310, 849, 1142, 273, 253, 3602, 403, 10486, 24802, 323, 1650, 604, 253, 11843, 273, 253, 6353, 285, 253, 9452, 11699, 3057, 3602, 403, 1097, 45346, 651, 2649, 436, 1421, 281, 3237, 275, 1759, 1430, 26332, 1097, 13483, 2550, 320, 10486, 14042, 323, 4795, 275, 2853, 48384, 9367, 27935, 50276, 88, 20, 1057, 253, 2746, 5467, 247, 581, 531, 17668, 875, 253, 8131, 285, 2303, 5281, 1223, 436, 1537, 1646, 247, 5272, 9376, 891, 2868, 11786, 13782, 310, 41049, 285, 4931, 23851, 497, 436, 281, 320, 19595, 50276, 88, 21, 1529, 9560, 2508, 326, 253, 2929, 1057, 417, 1646, 281, 755, 949, 347, 342, 954, 643, 46350, 12057, 7274, 440, 2307, 5911, 2538, 275, 253, 18525, 985, 1537, 2701, 253, 30437, 273, 253, 985, 1580, 253, 12057, 3948, 760, 17930, 5621, 285, 2602, 2915, 8062, 326, 403, 17095, 891, 651, 8564, 352, 310, 1892, 281, 802, 4187, 1524, 10186, 2538, 824, 347, 8251, 395, 442, 274, 18144, 12716, 285, 2144, 3607, 275, 3718, 273, 253, 2929, 2167, 891, 1928, 436, 2508, 1537, 671, 320, 562, 273, 7990, 281, 271, 6070, 3332, 7274, 824, 347, 11454, 18525, 2718, 247, 285, 4715, 3520, 10806, 407, 11454, 20553, 270, 1705, 281, 2564, 281, 6016, 690, 273, 841, 7350, 50274, 8774, 50276, 6050, 253, 46350, 9864, 4809, 273, 253, 2929, 310, 417, 9619, 4460, 3652, 31831, 722, 649, 66, 22208, 5931, 1452, 257, 285, 2602, 3057, 3210, 50276, 296, 297, 18980, 249, 1162, 355, 4072, 253, 4583, 985, 310, 13943, 285, 12453, 247, 8037, 275, 253, 46350, 12057, 3114, 948, 8287, 46350, 2602, 2915, 8062, 347, 973, 347, 5016, 342, 247, 3710, 966, 273, 16572, 8248, 812, 1527, 598, 4722, 44201, 275, 35221, 4715, 285, 2602, 2915, 19763, 50276, 66, 11454, 18525, 2718, 26259, 2605, 285, 15840, 275, 3520, 10554, 549, 32693, 9169, 50276, 67, 4715, 3520, 10806, 342, 11454, 20553, 549, 32693, 9169, 50276, 68, 44755, 46350, 12057, 323, 4715, 285, 1453, 17857, 1686, 9169, 5474, 33032, 50276, 8774, 50276, 249, 436, 789, 253, 4477, 1246, 8013, 29943, 357, 247, 747, 7792, 323, 2602, 2915, 19763, 8892, 323, 35221, 4715, 285, 7219, 11333, 253, 3126, 8414, 273, 247, 4460, 2602, 26332, 22403, 494, 8013, 2144, 23776, 8013, 460, 534, 310, 2570, 281, 1566, 285, 26526, 984, 273, 253, 26557, 2570, 1029, 6967, 13200, 7424, 285, 253, 1781, 1180, 273, 7759, 273, 7185, 2330, 342, 2602, 4753, 253, 8013, 29943, 357, 7792, 29328, 884, 4460, 8892, 7668, 19763, 273, 253, 2602, 8013, 460, 2144, 253, 4477, 921, 11080, 16774, 1783, 326, 5899, 1375, 273, 253, 1445, 771, 813, 658, 35221, 4715, 11333, 1891, 281, 8069, 3037, 253, 4836, 1014, 846, 247, 6832, 2408, 273, 3733, 3021, 8069, 44762, 2355, 253, 10454, 273, 253, 4081, 8892, 285, 253, 19297, 273, 1375, 273, 253, 1445, 391, 77, 3210, 281, 1566, 253, 4081, 8892, 50275, 993, 23223, 50276, 18, 4460, 8892, 1016, 4836, 24543, 247, 1027, 5691, 24088, 690, 8892, 6388, 30627, 2980, 253, 8013, 460, 643, 6388, 9176, 7695, 253, 8013, 460, 
1223, 2568, 643, 8892, 6388, 48635, 581, 390, 2709, 8013, 460, 5113, 387, 581, 390, 2709, 2792, 285, 372, 14692, 352, 390, 4886, 352, 275, 690, 2424, 5133, 50275, 19, 253, 5235, 273, 8892, 1071, 2710, 42083, 273, 391, 77, 751, 1048, 3945, 7219, 3340, 275, 253, 1083, 273, 4471, 3924, 8892, 50276, 20, 1529, 2201, 3434, 638, 19306, 290, 275, 253, 2929, 310, 326, 253, 4477, 452, 6777, 281, 897, 247, 46350, 12057, 3948, 970, 253, 722, 649, 66, 22208, 985, 7624, 2403, 253, 27935, 2130, 323, 7219, 285, 1453, 11333, 50275, 21, 253, 2929, 16681, 949, 16774, 1543, 253, 34385, 273, 11786, 3169, 7274, 689, 771, 813, 658, 391, 77, 7274, 326, 25057, 253, 6944, 46350, 12057, 3948, 2584, 4715, 253, 2424, 8892, 50276, 22, 271, 1774, 32124, 273, 247, 22791, 310, 281, 12661, 8892, 326, 403, 10481, 2570, 323, 253, 1655, 1375, 273, 253, 1445, 7259, 253, 4477, 2126, 495, 1375, 273, 253, 1445, 771, 813, 658, 391, 77, 11333, 285, 921, 326, 841, 391, 77, 3210, 1347, 15225, 275, 247, 5020, 273, 253, 884, 8892, 32621, 14572, 9852, 2118, 8892, 403, 253, 760, 1264, 8892, 835, 253, 771, 813, 658, 7259, 403, 2104, 281, 1347, 8489, 3294, 1598, 342, 253, 11786, 3169, 7219, 7274, 534, 3746, 1347, 973, 275, 512, 533, 253, 8406, 33161, 285, 16260, 15106, 8892, 253, 1390, 4836, 8687, 4471, 6082, 19763, 285, 23169, 6465, 1048, 3945, 7219, 285, 7613, 50273, 585, 1209, 2224, 50276, 18, 275, 4677, 577, 253, 38622, 5556, 6081, 285, 253, 305, 69, 534, 1097, 1646, 281, 12724, 29010, 253, 6459, 23267, 671, 452, 1029, 11041, 690, 22378, 670, 849, 436, 476, 320, 5544, 651, 1361, 253, 9414, 1805, 33876, 907, 253, 1543, 50275, 19, 347, 581, 273, 253, 2022, 3916, 273, 253, 2929, 310, 253, 5691, 273, 2602, 2915, 19763, 285, 253, 10419, 273, 247, 7792, 323, 253, 1072, 352, 310, 30048, 281, 7568, 253, 7629, 273, 253, 4248, 273, 10183, 342, 2572, 390, 6379, 275, 40350, 273, 253, 4753, 1146, 32494, 247, 20407, 1783, 824, 347, 436, 17227, 323, 1650, 253, 7629, 275, 891, 276, 2228, 273, 253, 1682, 9591, 391, 77, 1566, 342, 3629, 275, 4917, 4073, 323, 8013, 460, 651, 5752, 281, 34647, 253, 4588, 5691, 22691, 407, 2602, 2915, 2144, 2022, 81, 1427, 275, 253, 3634, 273, 253, 1655, 4081, 7792, 273, 17406, 1580, 11052, 2602, 1255, 285, 3629, 40350, 310, 954, 2779, 417, 347, 2969, 347, 3629, 247, 2014, 1180, 824, 347, 4917, 4073, 436, 310, 247, 5884, 4468, 285, 625, 247, 14876, 2584, 247, 45290, 1783, 273, 253, 4081, 7792, 50274, 37585, 4278, 50276, 35640, 621, 50276, 18, 26480, 17888, 310, 271, 1774, 32124, 273, 4715, 11333, 275, 2087, 1580, 581, 273, 253, 13991, 273, 253, 1655, 789, 310, 281, 1246, 253, 8013, 29943, 357, 7792, 347, 247, 1039, 281, 417, 760, 17710, 391, 77, 285, 11786, 3169, 11333, 533, 671, 13398, 841, 767, 5870, 273, 3082, 352, 310, 671, 1774, 281, 7472, 253, 3045, 273, 841, 3210, 327, 39709, 533, 2905, 8892, 24088, 40238, 247, 2829, 342, 11184, 390, 650, 727, 1180, 273, 9246, 2820, 281, 1659, 625, 685, 495, 5113, 387, 7616, 8593, 50276, 19, 253, 25577, 12600, 281, 253, 2929, 407, 1323, 8807, 1162, 355, 18879, 990, 936, 423, 46350, 12057, 323, 4715, 285, 1453, 3863, 275, 253, 16424, 275, 11454, 1491, 5162, 2718, 8059, 275, 4765, 3133, 281, 320, 6015, 50275, 20, 1529, 2442, 3884, 273, 253, 1655, 7792, 812, 320, 970, 8013, 29943, 357, 281, 3037, 7823, 534, 1537, 320, 9495, 281, 253, 1524, 10186, 2074, 281, 253, 4836, 5393, 275, 337, 604, 17887, 6240, 690, 4864, 22378, 670, 436, 275, 253, 3634, 273, 337, 1537, 1527, 598, 2007, 44201, 273, 17947, 323, 8013, 29943, 357, 50275, 250, 3065, 50276, 18, 1111, 284, 480, 480, 1443, 256, 34843, 
1988, 29168, 948, 85, 43549, 35221, 4715, 323, 22403, 494, 1789, 19763, 549, 32693, 638, 3845, 549, 32693, 11395, 25616, 37256, 4765, 12468, 1384, 7152, 339, 431, 248, 2929, 23970, 247, 747, 13279, 1505, 9864, 22791, 323, 2602, 15688, 982, 253, 9864, 3126, 21168, 327, 1755, 273, 722, 649, 66, 22208, 271, 5368, 46350, 40022, 534, 13276, 990, 936, 423, 1027, 74, 1430, 253, 2929, 29328, 884, 1027, 8892, 1016, 342, 608, 10575, 285, 44995, 1097, 391, 77, 3169, 3646, 4715, 3082, 285, 11786, 3169, 13757, 3082, 327, 1110, 8892, 253, 1543, 5936, 6747, 1655, 391, 77, 3169, 3082, 4543, 11786, 3169, 1332, 476, 8415, 954, 273, 253, 8892, 14556, 3340, 323, 1110, 2430, 1048, 3945, 7219, 50276, 1189, 455, 253, 2929, 310, 973, 15720, 285, 253, 7680, 310, 973, 1662, 2107, 891, 452, 247, 1643, 5701, 50276, 34974, 347, 3637, 50276, 783, 40022, 760, 19401, 253, 1375, 273, 253, 990, 8222, 263, 273, 253, 9452, 11699, 352, 651, 320, 1270, 281, 1908, 2169, 513, 71, 2602, 9452, 28457, 24088, 1249, 534, 651, 2007, 5649, 253, 2602, 15688, 982, 3114, 50276, 28821, 253, 3632, 1255, 285, 253, 3753, 273, 253, 391, 77, 11333, 495, 253, 7103, 275, 2593, 8073, 943, 320, 2218, 342, 387, 1878, 2709, 3632, 12922, 342, 2709, 7587, 591, 8357, 281, 1056, 253, 22791, 272, 1543, 50276, 8766, 18260, 1534, 50276, 3549, 6241, 323, 2593, 20, 275, 2829, 337, 352, 651, 320, 1270, 281, 923, 253, 2629, 11254, 275, 1635, 281, 253, 3388, 1318, 697, 671, 417, 2590, 849, 1142, 7587, 497, 5196, 275, 1340, 281, 755, 253, 3904, 2011, 275, 1097, 2829, 337, 285, 253, 14777, 275, 3036, 577, 50276, 18, 14804, 256, 386, 1758, 7349, 17622, 1162, 355, 7870, 1453, 273, 2602, 25497, 18745, 342, 253, 3126, 4765, 26332, 1796, 5213, 8059, 327, 2602, 15688, 982, 4848, 375, 23037, 26332, 1796, 4765, 50276, 19, 3471, 4652, 289, 321, 307, 2955, 289, 4921, 1162, 355, 1453, 8130, 323, 2602, 35121, 9452, 28457, 247, 6630, 2602, 15688, 982, 8073, 4765, 21300, 18121, 50276, 20, 465, 13243, 5973, 465, 6168, 5916, 267, 1182, 2320, 274, 8952, 21799, 1314, 285, 250, 260, 757, 1258, 531, 4172, 284, 700, 310, 5247, 3371, 4415, 21175, 1952, 38041, 275, 5145, 4715, 22586, 17857, 1686, 4765, 7152, 339, 377, 77, 3258, 29943, 357, 50276, 783, 2929, 10262, 247, 747, 2602, 2915, 19763, 22791, 323, 391, 77, 285, 46350, 7219, 253, 3559, 9864, 18880, 310, 1077, 4722, 285, 253, 7680, 310, 4891, 50276, 45563, 50276, 1826, 9864, 22791, 342, 3386, 326, 403, 417, 2568, 973, 14859, 50276, 19623, 6051, 12057, 281, 1527, 598, 15018, 323, 7219, 3082, 50276, 40480, 403, 2834, 2217, 281, 320, 11132, 323, 247, 1223, 50276, 44650, 1543, 403, 2530, 50276, 20881, 1255, 265, 50276, 7483, 253, 13782, 2069, 651, 320, 1175, 281, 823, 50276, 49836, 253, 2929, 310, 4518, 3542, 285, 3477, 281, 956, 50276, 1576, 281, 3157, 253, 2929, 50276, 12081, 13273, 2069, 651, 1663, 320, 1077, 4217, 1097, 323, 253, 3579, 1509, 347, 973, 347, 247, 19265, 1509, 949, 253, 2862, 16892, 342, 38622, 5046, 671, 690, 7211, 327, 849, 352, 476, 320, 7529, 1025, 1580, 368, 452, 247, 260, 14776, 7092, 50275, 23454, 50276, 81, 22, 1390, 12494, 323, 667, 9860, 2792, 342, 247, 6704, 4181, 277, 253, 15895, 310, 417, 2590, 2217, 50275, 3088, 368, 1599, 342, 2762, 6704, 4181, 1742, 417, 984, 368, 476, 671, 452, 24280, 533, 2139, 651, 247, 1127, 840, 417, 452, 247, 4181, 281, 253, 16572, 2133, 50274, 18941, 12494, 407, 5426, 256, 27221, 28596, 342, 277, 1919, 277, 4916, 4016, 672, 24280, 6634, 973, 352, 27221, 342, 3629, 4181, 285, 840, 352, 2550, 2489, 4016, 604, 352, 5459, 50276, 3712, 891, 276, 5426, 403, 256, 1900, 2762, 891, 
13414, 4555, 2096, 752, 253, 2280, 13148, 256, 310, 891, 871, 352, 432, 16572, 2133, 8062, 533, 436, 1057, 417, 1646, 281, 320, 253, 1072, 1060, 476, 368, 19148, 436, 1805, 824, 326, 352, 4916, 2590, 2139, 253, 7212, 8631, 285, 891, 276, 50276, 926, 577, 1908, 11922, 253, 14370, 4114, 326, 396, 357, 1575, 4648, 8356, 253, 14777, 588, 1007, 1199, 28452, 285, 1805, 7985, 50274, 187, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 747, 46350, 12057, 22791, 323, 2602, 2915, 19763, 253, 4081, 22791, 310, 1754, 327, 253, 50276, 69, 338, 649, 66, 22208, 985, 2067, 5368, 35221, 4715, 11333, 403, 6760, 327, 436, 22791, 253, 2929, 4271, 247, 873, 273, 2234, 7881, 326, 403, 22691, 407, 436, 2173, 22791, 281, 391, 77, 11333, 2159, 16892, 8892, 403, 2011, 281, 320, 17887, 407, 39793, 253, 12057, 3602, 3066, 11786, 18499, 253, 30628, 5194, 326, 436, 2929, 310, 1077, 973, 15720, 253, 50276, 28872, 11463, 1070, 275, 352, 310, 3240, 4722, 285, 11132, 285, 253, 897, 273, 46350, 12057, 275, 391, 77, 323, 40238, 2602, 5113, 3240, 27807 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the authors propose a soft gradientbased subword tokenization gbst module with the aim of improving tokenizerfree endtoend training of language models the gbst module takes a bytelevel sequence and computes all possible subword representations up to a length in a convolutionlike manner the results are then pooled scored and weighted with other addons including 1dconvolutions and an attentionlike positionwise score the authors use the gbst module followed by an encoderdecoder transformer stack similar to that of t5 and call it charformer the authors conducted experiments on a range of nlp classification tasks under monolingual and multilingual settings the trend seems consistent that subwordlevel models > charformer ≈ bytelevel t5 on monolingual and multilingual clean data charformer ≈ bytelevel t5 > subwordlevel t5 for monolingual noisy data in terms of efficiency the authors show evidence that charformer uses fewer parameters and flops than subword t5 and proceeds more steps per second than bytelevel t5 unclear how is the proposed gbst different from convolution despite the scoring part pro extensive evaluation results across many nlp classification tasks in mono and multilingual settings pro charformer seems to achieve similar predictive performance to bytelevel t5 while being more computationally efficient table 6 mixed charformer achieves worse predictive performance than subword models bert and t5 which seems to suggest the proposed gbst gives little lift from characterlevel to subwordlevel however the charformer models presented are smaller in size or faster in speed or more efficient in flops thus hopefully it will provide people with more choices for their use cases mixed experimental results do not show strong improvements on accuracy metrics table 13 shows that charformer is generally on par with bytelevel t5 sometimes charformer wins other times bytelevel t5 wins either way with small differences in the multilingual case table 4 we do see charformer consistently outperform bytelevel t5 with a small margin in all cases charformer is significantly worse than subword models the rescaled version charformers is able to achieve comparable and sometimes better accuracy than subword models however the rescaling is not tied to charformer and can be applied to other characterlevel models as well it is not clear if the gain is because of the proposed gbst or solely due to the rescaling con it is not clear if the proposed gbst module learns meaningful subword tokenization better qualitative evidence eg examples showing that highly scored subwords align with human intuition or quantitative evidence eg how the learned tokenized subwords align with established methods or how they are better at solving lexical tasks would help con many technical details are unclear or confusing it would help if the authors can provide more and clearer technical details if limited by pages the authors can point readers to appendices but should make sure the main text still contains the necessary details when moving things to appendices con no code provided providing code will greatly help reproducibility and help clarify many technical details not fully described in the paper questions to the authors how is the proposed gbst different from convolution despite the scoring part can the authors help clarify the design of gbst and implementation of charformer for example what is the transformer stack used by charformer it seems like t5 but
i dont find it mentioned anywhere see other localized points below all the tasks in the paper seem to be classification tasks is charformer encoder only if so how do the authors ensure a fair comparison with encoderdecoder models eg t5 if not is the decoder part of charformer characterbased or it also uses some notion of subwords localized points eq 1 what is used as f in the experiments in considering offsets how is 1d convolution applied to x how does it save computation figure 2 how to interpret this heatmap for block size 1 how are they aligned for example we see a high score for block size 3 at k is it for subword tok or subword ken a plausible proposal for the promising direction of learning subword tokenization endtoend however some important design and implementation details are missing or confusing which are not helped by the lack of source code the extensive experimental results themselves would benefit the community however they dont seem to strongly support the advantage of the proposed module in learning subword representation either in terms of endtask accuracy or explainability if the authors could clarify what they did exactly with their gbst module and thus their novelty i would be inclined to accept for the merit of extensive evaluation results and one more alternative that lies between subwordlevel and plainbytelevel models docsepthis paper aims to remove the reliance of nlp models to external tokenizers it proposes a gradientbased subword tokenization module that can be used in any neural model the module scores predefined candidate blocks in a soft way for each position and then it weight averages them to obtain a mixture of subword representations the resulting sequence is then downsampled in a fixed way to make processing easier the proposed method integrated in a deep narrow transformer model performs on par and slightly better than characterlevel baselines on monolingual and multilingual classification respectively in some settings it also outperforms strong subwordlevel baselines strengths 1 avoiding the reliance of nlp models on tokenizers is an important problem tokenizers influence the input structure and the design of the model the input structure is not amenable to graceful modifications after the initial training which can be limiting 2 the idea to calculate position representations based on a weighted average of subword blocks is interesting this way the representation captures information from nearby ngrams 3 the experimentation has some competitive results and focuses on both performance and speed aspects weaknesses 1 my main concern is that the proposed method does not seem to be satisfying the criteria for a proper tokenization method first it does not learn to segment the input to different chunks to be fed in the contextualizer but rather downsamples the sequence in a fixed way the latent subword representations could be captured by combing multiple convolutions and a simple pooling function per position which makes the method seem less novel second the soft selection at each position is performed over blocks of different size that correspond to that position and does not take into account global information eg by computing scores of all possible segmentations to maximize for the most likely one 2 the current framing would suggest comparison to existing tokenization methods like bpe or recent optimized ones to the downstream task 12 in a controlled setting eg similar vocabulary size and contextualizer design but this was not explored at all in the 
evaluation in addition there is no proper evaluation of the learned segmentations other than the visualization of the learned weights for a single example 3 the results are not that good compared to pure characterlevel models on average which makes the results less exciting also one of the claimed benefits of the proposed method that it is faster is mainly due to the fixed downsampling which is not something new and applies to other characterlevel baselines 4 studies that focus on tokenization like the ones cited below evaluate on tasks where tokenization is important for reaching stateoftheart performance like machine translation it might be worth evaluating there question in section 8.2 about monolingual datasets was the downsampling rate optimized only for the proposed model 1 https://aclanthology.org/2020.findings-emnlp.120.pdf 2 https://arxiv.org/pdf/2012.15671.pdf the paper focuses on an interesting problem and has many experiments with stateoftheart results in certain settings my main concerns are regarding its novelty framing and evaluation even though some of the results are competitive the experiments do not provide enough evidence that the learned tokenization is crucial for achieving the results docsepthis paper focuses on a gradientbased subword tokenization gbst method for bytelevel transformers the key idea is to use a linear transformation with trainable parameters and a softmax to compute an embedding for each character based on several ngrams in its neighborhood and then use average pooling to downsample and decrease the sequence length strength 1 the gbst module is simpler and faster it has onpar performance but is 35-80% faster with less memory usage than the current sota bytelevel transformer byt5 comparing with the lighter byt5canine model it has similar speed and memory usage but is more straightforward and has better accuracy it also has onpar performance and lower computation costs in most tasks comparing with t5/mt5 2 the rescaling of the model sbase achieves better performance with 20% less computation though this idea to have a deeper encoder is from previous work it is still informative to have the experiment 3 the paper is very well written and has enough experiments to show the models performance in accuracy and capability in different tasks weakness 1 minor the technical novelty of this paper is only gbst a dense layer over ngrams while the training setting follows byt5 and the best sbase model is also an implementation of the deep encoder concept 2 the scalability of the model is not shown all the experiments focus on models other than mt5 with a parameter scale of 200m but byt5s smallest model has 300m parameters and performs much better than the models with 200m parameters comparing table 4 of byt5 and table 4 of this paper 3 minor t5 seems much better than sbase under zeroshot settings though it takes more computation 4 from the last sentence of 3.1 baselines it seems that the byt5 model used for comparison is unscaled and i couldnt find in the paper the structure of this byt5 sorry if i missed that this is important since in the byt5 paper all the model variants have a much heavier weight in the encoders and there is also an ablation study to prove that a heavier encoder can improve performance a lot in this paper the scaled charformersbase is also much better than the unscaled charformerbase thus it is a bit strange that the best settings of both models are not used for comparison to summarize pro 1 excellent paper writing 2 extensive experiments 3 simple but effective tokenfree module con 1 lack of experiments on different sizes of the model especially performance at a larger scale 2 the comparison seems not to be between the best setting deep encoder shallow decoder of both the baseline and this model docsepthis work first implements a bytelevel character tokenizer which is learned as part of the model rather than as a preprocessing task to limit the computational burden of characterlevel encoding on the downstream architecture the method creates byte ngrams and combines them via a scoring network as outlined by the authors common transformer architectures can use the tokenizer and even be narrowed finally the model is evaluated using multiple nlu and noisy nlp tasks as a result the approach achieves comparable endtask performance to subword and other bytelevel transformers while improving parameter efficiency as well as ease of use for non-english nlp tasks strengths the work makes tokenization more adaptable by learning subtokens on the fly training speed and parameter efficiency are improved over more rigid subword tokenization based transformers in many cases the tokenization method is based on a weighting of pooled byte ngram embeddings score calibration and downsampling are technically sensible and the authors make a noticeable effort to ease optimization and not introduce foreseeably brittle complexities like extra hyperparameters the ngram weighting approach allows the subtoken weightings to remain somewhat interpretable the downsampling operation allows an easy way of saving later transformer stack parameter size which is shown to increase learning speed and reduce parameters potential limitations that require future work are mentioned proactively instances of reasoning about technical choices eg narrower encoders provide instructive details and help make the foci of the experiments more deducible experiments are intentionally controlled and deconflated to produce more robust insights table 2 uses aucpr rather than the original works aucroc as the performance measure which can be considered an improvement this work first implements an endtoend trainable bytelevel character tokenizer while limiting computational overhead via a learned token weighting it is compatible with common transformer stacks but can reduce their width for computational gains without sacrificing performance and in some places gaining performance the evaluation on multiple nlu and noisy nlp tasks demonstrates only slightly less to slightly stronger performance compared to byte and subword tokenization models ease of use being conceivably language agnostic training more efficiently and allowing easy reimplementation in pytorch make this work a welcome contribution to the field therefore i recommend accepting this work docsepthis work aims to let the model learn the subword representations by itself from characterlevel embeddings of the input without tokenizing a gradientbased subword tokenization module gbst is designed as a replacement for the general tokenization the proposed charformer a rescaled transformer architecture transformer with gbst that integrates gradientbased subword tokenization outperforms several bytelevel baselines and performs on par with some subwordlevel models on three kinds of benchmarks including glue multilingual and noisy text datasets moreover charformer has the advantages of less memory usage and faster speed compared to the models with similar parameter size strengths 1 the paper proposes a novel gradientbased tokenization module to address the issue of rigid subword tokenization algorithms and attains competitive performance the proposed method saves engineering efforts for tokenization which would be easy to generalize to different tasks 2 it is interesting to find that this module can deal with multilingual and noisy datasets to some extent which gives credit to learning subword representations from characters 3 this proposed tokenfree module gbst has less computational cost than using the general methods of tokenization and can be well extended to other models weaknesses 1 the performance is not surprising though the model indeed saves model size it is not clear if the method works for other languages for example characterbased models tend to achieve similar performance compared with wordbased ones would this method have an advantage over the characterbased baseline then 2 the evaluations are based on base models since this model saves model size would it achieve better performance compared with the same size eg 200m like bytelevel t5 base in table 1 did you try large models besides it would be impactful if this paper showed that the gbst module could actually generalize in other kinds of models minor comments 1 i wonder if average pooling is a good way to downsample it is really fast but how about an mlp 2 in eq 3 the indices should be 0 to m-1 or 1 to m since the total number of block sizes is m ie p_i = softmax(p_{0,i}, p_{1,i}, ..., p_{m,i}) should be p_i = softmax(p_{1,i}, p_{2,i}, ..., p_{m,i}) 3 in figure 1 b it seems the annotation p58 in the fourth line is missing 4 in section 2.1.2 block sizes 1 <= b <= m should be 1 <= b <= M 5 in section 2.1.5 latent subwords hat-x = hat-x_1, ..., hat-x_m should be latent subwords hat-x = hat-x_1, ..., hat-x_l this work is wellmotivated the overall design of the model is technically sound and it provides moderate improvements on several datasets while being memory efficient and faster with fewer parameters it is not clear if the advance of the method can generalize to other languages or larger models ### Summary:
this paper introduces a soft gradientbased subword tokenization module gbst that learns latent subword representations from characters gbst enumerates candidate subword blocks and learns to score them in a positionwise fashion using a block scoring network the resulting model was tested on glue and several crosslingual tasks the performance is competitive with byt5 and often similar to subword models while being more efficient in flops and ram reviewers are mixed on this the negative reviewer points out that this is not a real tokenizer and does not produce a tokenization that experiments use the base model and do not address bigger scales that there is a lack of code which is important for this kind of work that the resulting accuracy gains are not significant and that the method is not interpretable the positive reviewers like the extensive experiments the efficiency improvements and the flexibility and simplicity of the gbst module the authors seemed to have addressed most of the reviewer issues by providing larger scale experiments and code i believe the results are fairly strong since one would not expect a big performance difference in a learned tokenization method but rather efficiency or flexibility gains the paper is generally wellwritten though details about the convolution should be included in the text and not just the code recommending accept
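For readers who want a concrete picture of the block scoring and downsampling described above, the following PyTorch sketch shows one way the GBST idea can be implemented. It is a hedged reconstruction rather than the authors' exact module: the class name, the defaults for `max_block_size` and `downsample_rate`, and the use of non-overlapping mean-pooled blocks are assumptions, and the published module additionally enumerates block offsets, can apply a convolution before scoring, and calibrates scores across positions, none of which is shown here.

```python
import torch
import torch.nn.functional as F
from torch import nn

class GBSTSketch(nn.Module):
    """Illustrative sketch of gradient-based subword tokenization (GBST).

    A minimal reconstruction under stated assumptions, not the published
    implementation: offsets, pre-scoring convolution, and score calibration
    are omitted.
    """

    def __init__(self, dim: int, max_block_size: int = 4, downsample_rate: int = 2):
        super().__init__()
        self.max_block_size = max_block_size
        self.downsample_rate = downsample_rate
        self.score = nn.Linear(dim, 1)  # block scoring network (one score per candidate block)

    def forward(self, char_emb: torch.Tensor) -> torch.Tensor:
        # char_emb: (batch, seq_len, dim) character/byte embeddings
        seq_len = char_emb.size(1)
        blocks, scores = [], []
        for b in range(1, self.max_block_size + 1):
            # mean-pool non-overlapping blocks of size b ...
            pooled = F.avg_pool1d(char_emb.transpose(1, 2), b, stride=b, ceil_mode=True)
            pooled = pooled.transpose(1, 2)
            # ... and broadcast each block representation back to its character positions
            upsampled = pooled.repeat_interleave(b, dim=1)[:, :seq_len]
            blocks.append(upsampled)
            scores.append(self.score(upsampled))                     # (batch, seq_len, 1)
        blocks = torch.stack(blocks, dim=2)                          # (batch, seq_len, M, dim)
        weights = torch.softmax(torch.cat(scores, dim=-1), dim=-1)   # soft choice over block sizes
        latent = (weights.unsqueeze(-1) * blocks).sum(dim=2)         # latent subword per position
        # fixed-rate mean pooling shortens the sequence seen by the transformer stack
        out = F.avg_pool1d(latent.transpose(1, 2), self.downsample_rate,
                           stride=self.downsample_rate, ceil_mode=True)
        return out.transpose(1, 2)
```

With `downsample_rate = 2` the downstream transformer sees roughly half as many positions as there are input characters, which is where the speed and memory savings discussed in the reviews come from.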
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: summary the authors tighten the analysis of the minimum width for universal approximation with relu networks for lp functions where they have an exact characterization in terms of the input and output dimensions the paper is well written and easy to follow the review of prior art is quite clear major comments and questions 1 it appears that the main result on lp functions can be seen as a generalization of hanin and sellke 2017s result for dy = 1 to arbitrary dy could you contrast the main differences in the proof technique 2 its interesting that uniform approximation is harder to obtain with relu networks while relu+step works however step functions are not ideal from a practical perspective due to the almost everywhere zero gradient is there an optimizationfriendly activation that satisfies uniform approximation with minimal width 3 does relu networks of width m refer to networks where the number of layers is arbitrary could you please clarify what is the minimum number of layers required for the results to hold are there any implications for relu networks of fixed layers eg two and three layers minor comments table 1 ours theorem 2 has max(dx+1, dy) but isnt dy = 2 in this case docsep summary the paper studies the minimal neural network width needed for universal approximation while previous papers on the subject merely provided lower and upper bounds this paper derives exact bounds on the minimal width the paper considers both relu networks as well as general activation functions and examines approximation under both the uniform norm and the lp norm in the case of functions with highdimensional outputs the derived bounds are better than previously thought from dx + dy + 1 to max(dx + 1, dy) with possible implications for practical network design detailed review main strengths the first exact bounds on the minimal width required for universal approximation closing the gap between lower and upper bounds of previous results significant improvement for the case of lp functions with multidimensional output with possible practical implications for encoderdecoder style networks which are common in many vision and language tasks prior results had suggested a width double the input dimension whereas here it has been demonstrated to be sufficient to use the same dimension as the input 1 show that uniform approximation with relu networks is slightly weaker than with networks with discontinuous activations specifically using both relu and step functions the results are very clearly presented as are their comparisons to previous bounds including a very intuitive proof sketch main weaknesses for the case of scalar lp functions the previous gap was merely off by one ie dx + 1 <= wmin <= dx + 2 in this particular situation the improvement in absolute terms is quite minimal even in the general case the prior lower and upper bounds were already asymptotically tight i recommend the paper be accepted since the paper provides exact bounds that close the gap between lower and upper bounds and that helps us understand these networks better although the gap has already been quite small in some situations which might be regarded as incremental in other cases such as the very common encoderdecoder setting dx = dy the gap is large enough to affect practical considerations when designing compact networks these improvements should be of interest to the general ml community docsepthis paper wants to establish tight theoretical lower bounds on the minimum width required by a relu neural network to approximate almost all functions up to epsilon where the distance of approximation is defined using the lp norm the paper improves the previously known bounds which lay in the range of d+1 to d+4 to exactly d+1 one of the results of this paper is to establish that this bound is exactly d+1 i am summarizing the result while ignoring some precision nuances this result holds only when the distance is measured using the p norm where p is finite in the infinity norm setting they show that this minimum width is indeed not enough for universal approximation however to ensure that the claimed minimum width also holds in the infinitynorm setting they modify the activation functions to also allow for the threshold function which is 1 if the input is larger than 0 and 0 otherwise this is primarily a mathematical paper and the goal is to understand the minimum width required when arbitrary depth is allowed and thus the results in this paper should be viewed as an interesting way to completely fill in the full understanding of the mathematics of deep neural nets in general the insights are unlikely to have any bearing on practice with that disclaimer here are my opinions of the paper the paper provides a very interesting set of results first it fully solves the problem and gives exact bounds where prior works have only given bounds without getting the precise constants although one could argue that improving constants is not as significant it still is important and interesting to learn that we can indeed find the precise constant and arguably using a very nice proof construction second this paper also shows the dichotomy between approximation measured using lp and linfinity in particular there is no smooth transition in the minimum width required using only relu activations between finite p and p = infinity however it is also interesting that the fix to obtain the same minimum width is to allow for another additional activation function and like relu it is only nondifferentiable at 0 although unlike relu it is not continuous at 0 either overall the three sets of results form an interesting and complete picture of the landscape for the required minimum width the proof techniques involve viewing the mapping from input to output as a set of encoder memorizer and decoder steps and implementing each of those using neural nets with relu and optimal width combining the maximum of these widths gives the final required bound i really like that the proof and construction are clean and simple to understand usually getting precise constants involves messy constructions that is not the case in this paper and that is by far the biggest strength of this paper i had a few suggestions on this paper that i feel could make the results in this paper even better even though the goal is to allow arbitrary depth it is still instructive to add a discussion on the depth required in these bounds in particular what is the cost of having the width as small as the minimum required one in terms of how deep the network should be given a fixed error epsilon or in terms of k and m if that is convenient in the result for p = infinity is the number of step activations required optimal it is clear that you would need some step functions but how many currently you seem to require one for every input coordinate is this optimal a discussion of this and its optimality will make the theorem and result stronger it was not directly or quickly obvious to me how lemma 8 applying the q_k one for each input coordinate leads to a width of d+1 as opposed to 2d the answer is that this is because you are mapping the identity for all coordinates except one of them and repeating it dx times so stating this out will make it quickly accessible to the reader even though this may seem obvious it improves the readability there seems to be a caveat that the neural net uses the target function itself in the construction of its weights although this proof is about the information needed a discussion surrounding this andor explaining why this is okay will again make the spirit of the results clear to the reader docsepthis paper studies the problem of universal approximation with networks of bounded width and arbitrary depth the objective is to understand what is the minimum width necessary to approximate any function in a suitable space the main results of the paper can be summarized as follows 1 relu networks approximate functions in L^p(R^dx, R^dy) if and only if the width is at least max(dx+1, dy) this result is tight and improves upon kidger and lyons 2020 which give an upper bound of dx+dy+1 2 the same result does not hold if we look at the approximation in C(K, R^dy) K being a compact set the authors prove that the minimum width is 3 by giving a counterexample 3 in order to maintain the width of max(dx+1, dy) in C(K, R^dy) it suffices to use relu and threshold activations 4 an upper bound on the width of max(dx+2, dy+1) for approximation in L^p(K, R^dy) is given for a wide class of activation functions the proofs contain two main elements of novelty a the upper bound results 1 3 4 above rely on what the authors call a coding scheme the input is mapped to a onedimensional codeword by an encoder the codeword is mapped to a onedimensional target by a memorizer and the target is mapped to a vector close to the output by the decoder the idea is that all these three maps encoder memorizer and decoder can be constructed with neural networks of width max(dx+1, dy) by using ideas of prior results eg hanin and sellke 2017 the point of the coding scheme is to decouple the input and the output dimension the original mapping from a space of dimension dx to a space of dimension dy is broken into i a mapping from dimension dx to dimension 1 ii a mapping from dimension 1 to dimension 1 and iii a mapping from dimension 1 to dimension dy this is what allows one to improve the bound on the width from dx+dy+1 to max(dx+1, dy) b the counterexample on which the lower bound is based result 2 above comes from a topological argument the paper is well written the results are interesting and strong the proof techniques are novel thus i am generally positive about the submission i have a few questions and remarks q1 this is a general question the authors provide a general upper bound on the width of max(dx+2, dy+1) is that tight for a subclass of activations any comment on how to improve it beyond relu q2 lemma 5 does the set S or equivalently the choice of a1 a2 b1 b2 depend on phi it looks like this is the case and it would be better to clarify this point q3 section 5.2 it is mentioned that by the definition of ell and lemma 5 there exists a line intersecting with B how do you guarantee the intersection with B the set S in lemma 5 is generic and in principle may not give rise to an intersection q4 in lemma 6 the authors talk about a bounded pathconnected component without defining what pathconnected means i spotted a couple of typos t1 page 6 memorizer k m should be memorize k m t2 differentiable at at least this occurs several times in the main text and the appendix as well
### Summary:
two knowledgeable reviewers and one fairly confident reviewer were positive (7) about this submission the authors response clarified a few questions and comments from the initial reviews the paper provides exact bounds that close the gap between lower and upper bounds and that helps us understand these networks better with the unanimously positive feedback i am recommending the paper to be accepted
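To make the construction discussed in these reviews easier to picture, the coding scheme and the resulting exact bound can be written schematically as below. This is a paraphrase in my own notation, and the quantization and precision bookkeeping that the encoder and memorizer rely on are omitted.

```latex
% Coding scheme sketched in the reviews (notation mine): decouple input and
% output dimensions by composing three maps through a one-dimensional bottleneck.
\[
f \;\approx\; \underbrace{D}_{\text{decoder}} \circ \underbrace{M}_{\text{memorizer}} \circ \underbrace{E}_{\text{encoder}},
\qquad
E:\mathbb{R}^{d_x}\to\mathbb{R},\quad
M:\mathbb{R}\to\mathbb{R},\quad
D:\mathbb{R}\to\mathbb{R}^{d_y}.
\]
% Each factor is claimed to be realizable by a ReLU network of width at most
% max(d_x + 1, d_y); together with the paper's matching lower bound this gives,
% for L^p approximation,
\[
w_{\min}\bigl(L^p(\mathbb{R}^{d_x},\mathbb{R}^{d_y})\bigr) \;=\; \max(d_x + 1,\; d_y).
\]
```

The uniform-norm caveat raised in the reviews still applies: with relu alone this width is not sufficient in C(K, R^dy), and the paper recovers it by additionally allowing threshold activations.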
[ 24330, 5701, 34974, 50276, 18, 352, 4620, 326, 253, 2022, 906, 327, 39322, 3470, 476, 320, 2326, 347, 247, 26647, 273, 15761, 249, 285, 5580, 413, 4240, 84, 906, 323, 17713, 18, 281, 10341, 17713, 812, 368, 4499, 253, 2022, 3910, 275, 253, 4737, 5853, 50275, 19, 697, 4722, 326, 6447, 11193, 310, 12150, 281, 4044, 342, 774, 86, 6928, 1223, 774, 461, 554, 2987, 2299, 3213, 3470, 403, 417, 7445, 432, 247, 8542, 8668, 1955, 281, 253, 247, 70, 5058, 11786, 310, 627, 271, 13757, 19771, 5743, 326, 12310, 6447, 11193, 342, 8723, 4871, 50276, 20, 1057, 774, 86, 6928, 273, 4871, 278, 3730, 281, 6928, 835, 253, 1180, 273, 8090, 310, 10341, 812, 368, 4496, 19148, 50276, 5371, 310, 253, 5927, 1180, 273, 8090, 2424, 323, 253, 1543, 281, 2186, 403, 627, 667, 12739, 323, 774, 86, 6928, 273, 4229, 8090, 24088, 767, 285, 1264, 8090, 50276, 37585, 5701, 2829, 337, 20451, 10012, 374, 556, 2781, 9665, 18, 6421, 533, 310, 2649, 17713, 19, 275, 436, 260, 833, 406, 33032, 6010, 50275, 783, 2929, 2175, 253, 8723, 11454, 2990, 4871, 3058, 323, 10898, 11193, 1223, 2045, 9380, 327, 253, 2256, 7960, 2530, 2406, 285, 5170, 14493, 436, 2929, 38422, 3242, 14493, 327, 253, 8723, 4871, 253, 2929, 19401, 1097, 774, 86, 6928, 347, 973, 347, 2087, 5743, 3470, 285, 33888, 11193, 762, 1097, 6447, 5222, 285, 39322, 5222, 275, 253, 1083, 273, 3470, 342, 1029, 6967, 18012, 253, 6012, 14493, 403, 1805, 685, 3786, 1869, 18747, 50276, 6421, 50276, 18, 281, 2781, 9665, 50276, 18, 17713, 342, 1896, 12739, 281, 8542, 575, 2990, 2216, 50275, 5992, 7193, 2278, 50275, 7265, 20544, 50276, 783, 806, 3242, 14493, 327, 253, 8723, 4871, 2424, 323, 10898, 11193, 11196, 253, 8037, 875, 2406, 285, 5170, 14493, 273, 2045, 1543, 50276, 32258, 7756, 323, 253, 1083, 273, 39322, 3470, 342, 23964, 37613, 3453, 342, 1896, 8542, 12739, 323, 32049, 48759, 3740, 6928, 534, 403, 1846, 275, 1142, 8113, 285, 3448, 8892, 2720, 1543, 574, 5125, 247, 4871, 4021, 253, 3280, 7877, 5727, 1060, 352, 556, 644, 5183, 281, 320, 4209, 281, 897, 253, 1072, 7877, 347, 253, 3280, 337, 50276, 9029, 326, 6447, 11193, 342, 774, 86, 6928, 310, 5777, 21076, 685, 342, 6928, 342, 16196, 3472, 1396, 569, 5742, 970, 1097, 774, 86, 285, 3213, 3470, 50276, 783, 1543, 403, 1077, 4518, 3559, 347, 403, 616, 14023, 281, 2045, 14493, 1690, 247, 1077, 27350, 4737, 23211, 50276, 7265, 32213, 50276, 1542, 253, 1083, 273, 13434, 39322, 3470, 253, 2045, 8037, 369, 7960, 745, 407, 581, 26332, 18747, 50276, 18, 458, 259, 1222, 458, 18747, 50276, 19, 275, 436, 1798, 4112, 253, 7756, 275, 7880, 2426, 310, 3240, 8723, 50276, 9154, 275, 253, 2087, 1083, 253, 2720, 2406, 285, 5170, 14493, 497, 2168, 38311, 6863, 50276, 74, 5583, 253, 2929, 320, 7607, 1580, 253, 2929, 3400, 3242, 14493, 326, 2810, 253, 8037, 875, 2406, 285, 5170, 14493, 285, 326, 7729, 441, 2096, 841, 6928, 1805, 575, 20261, 253, 8037, 556, 2168, 644, 3240, 1355, 275, 690, 9534, 534, 1537, 320, 12258, 347, 32809, 275, 643, 575, 12866, 824, 347, 253, 1077, 1846, 32049, 48759, 4758, 18747, 50276, 6421, 253, 8037, 310, 1781, 2217, 281, 2818, 8542, 15711, 672, 20462, 8566, 6928, 841, 11701, 943, 320, 273, 1600, 281, 253, 2087, 13361, 3114, 7152, 33032, 2520, 2929, 5605, 281, 5100, 6863, 10527, 2406, 35800, 327, 253, 5927, 4871, 2424, 407, 247, 774, 86, 11454, 2990, 281, 16851, 2761, 512, 3470, 598, 281, 299, 4277, 835, 253, 4181, 273, 11193, 310, 2931, 970, 253, 39322, 12850, 253, 2929, 19132, 253, 3786, 1929, 14493, 534, 27786, 275, 253, 2491, 273, 277, 18, 285, 277, 21, 281, 4555, 277, 18, 581, 273, 253, 1543, 273, 436, 2929, 310, 281, 
5100, 326, 436, 3033, 310, 4555, 277, 18, 891, 717, 10405, 3006, 253, 906, 1223, 23111, 690, 12320, 3023, 1972, 436, 906, 6556, 760, 672, 253, 4181, 310, 4080, 970, 253, 268, 12850, 835, 268, 310, 6486, 275, 253, 23579, 5222, 4758, 597, 921, 326, 436, 5927, 4871, 2686, 310, 6296, 417, 247, 10898, 4020, 1080, 2299, 281, 5416, 326, 253, 7558, 5927, 4871, 671, 6556, 275, 253, 23579, 12850, 4758, 597, 10007, 253, 5743, 3470, 281, 671, 1581, 323, 253, 7887, 3470, 534, 310, 337, 604, 4067, 685, 470, 285, 470, 5010, 50275, 2520, 310, 8558, 247, 15965, 2929, 285, 253, 4736, 310, 281, 2096, 253, 5927, 4871, 2424, 672, 10341, 6864, 310, 4136, 285, 3021, 253, 1543, 275, 436, 2929, 943, 320, 11575, 347, 271, 4722, 1039, 281, 4336, 7522, 253, 2120, 4524, 6924, 275, 253, 23065, 273, 3676, 11454, 37507, 275, 2087, 253, 16039, 403, 11543, 281, 452, 667, 12206, 327, 3946, 342, 326, 27578, 1060, 403, 619, 11626, 273, 253, 2929, 253, 2929, 3400, 247, 1077, 4722, 873, 273, 1543, 806, 352, 4751, 35910, 253, 1895, 285, 4245, 3242, 14493, 835, 2720, 2987, 452, 760, 1677, 14493, 326, 1293, 2970, 253, 10799, 14637, 3738, 581, 812, 9059, 326, 11138, 14637, 310, 417, 347, 1534, 352, 1335, 310, 1774, 47606, 281, 3037, 326, 359, 476, 6296, 1089, 253, 10799, 3638, 285, 25711, 970, 247, 1077, 5322, 4737, 11682, 1273, 436, 2929, 671, 2722, 253, 19821, 22438, 875, 11193, 4080, 970, 39322, 285, 298, 43723, 275, 1798, 627, 310, 642, 6032, 5502, 875, 253, 5927, 4871, 2424, 970, 760, 774, 86, 1396, 569, 875, 6486, 268, 285, 268, 23579, 2299, 352, 310, 671, 4722, 326, 253, 4993, 281, 4044, 253, 1072, 5927, 4871, 310, 281, 1581, 323, 1529, 3081, 5743, 1159, 285, 751, 774, 86, 352, 310, 760, 27370, 7413, 6051, 387, 470, 3738, 12401, 774, 86, 352, 310, 417, 5415, 387, 470, 2057, 4583, 253, 1264, 873, 273, 1543, 830, 271, 4722, 285, 3426, 5406, 273, 253, 13016, 323, 253, 2424, 5927, 4871, 253, 4737, 5609, 6388, 14657, 253, 10603, 432, 3280, 281, 3453, 347, 247, 873, 273, 32049, 16407, 6081, 285, 29810, 5018, 285, 16994, 1016, 273, 1110, 970, 11454, 37507, 342, 774, 86, 285, 8654, 4871, 16248, 253, 4869, 273, 841, 34414, 4245, 253, 2457, 2424, 3033, 891, 1663, 751, 326, 253, 4737, 285, 5140, 310, 4076, 285, 2969, 281, 2096, 3798, 2970, 10799, 14637, 8687, 36396, 35831, 326, 310, 417, 253, 1083, 275, 436, 2929, 285, 326, 310, 407, 2080, 253, 5962, 4757, 273, 436, 2929, 50276, 74, 574, 247, 1643, 13991, 327, 436, 2929, 326, 891, 1928, 812, 1056, 253, 1543, 275, 436, 2929, 1014, 1805, 50274, 9154, 2167, 253, 4736, 310, 281, 1581, 10341, 6864, 352, 310, 1335, 49664, 281, 823, 247, 5955, 327, 253, 6864, 2424, 275, 841, 14493, 275, 1798, 752, 310, 253, 2105, 273, 1907, 253, 4871, 347, 1355, 347, 253, 5927, 2424, 581, 275, 2426, 273, 849, 3676, 253, 2990, 943, 320, 1677, 247, 4229, 2228, 299, 4277, 390, 275, 2426, 273, 465, 278, 604, 326, 310, 11638, 50273, 249, 253, 906, 323, 268, 50276, 43723, 310, 253, 1180, 273, 3213, 1396, 569, 2424, 8654, 352, 310, 2590, 326, 368, 651, 878, 690, 3213, 3470, 533, 849, 1142, 4390, 368, 1646, 281, 2430, 581, 323, 1046, 3280, 13249, 310, 436, 8654, 247, 5955, 273, 436, 285, 697, 5556, 1319, 588, 1056, 253, 10012, 6870, 625, 10046, 50274, 262, 369, 417, 3587, 32600, 314, 4755, 281, 479, 849, 18057, 854, 50276, 1212, 2943, 253, 2805, 76, 581, 323, 1016, 3280, 13249, 5644, 281, 247, 4871, 273, 277, 18, 347, 10066, 281, 374, 69, 253, 3662, 310, 326, 436, 310, 984, 368, 403, 10603, 6489, 323, 512, 11627, 3707, 581, 273, 731, 285, 24385, 352, 18747, 2069, 594, 14851, 436, 562, 588, 1056, 352, 4541, 12482, 
281, 253, 9414, 1014, 2167, 436, 778, 1646, 4755, 352, 19132, 253, 1239, 1430, 50274, 9088, 3133, 281, 320, 247, 15985, 255, 326, 253, 11454, 2036, 4648, 253, 2303, 1159, 3139, 275, 253, 5140, 273, 697, 13461, 3738, 436, 4737, 310, 670, 1491, 3058, 247, 5955, 8704, 436, 285, 263, 15571, 2139, 436, 310, 8261, 588, 969, 1056, 253, 5968, 273, 253, 1543, 2590, 281, 253, 9414, 50276, 7152, 33032, 2520, 2929, 2175, 253, 1895, 273, 10898, 11193, 342, 6928, 273, 11542, 4871, 285, 10341, 6864, 253, 8103, 310, 281, 2096, 47515, 253, 5927, 4871, 3309, 281, 16851, 667, 1159, 275, 247, 7470, 2317, 253, 2022, 1543, 273, 253, 2929, 476, 320, 17903, 347, 3637, 50276, 18, 774, 86, 6928, 16851, 3470, 275, 39322, 1991, 391, 9665, 14168, 4482, 391, 6421, 604, 285, 760, 604, 253, 4871, 310, 387, 1878, 2781, 9665, 18, 17713, 436, 906, 310, 6863, 285, 19132, 2220, 5772, 1063, 285, 12865, 790, 9169, 534, 1918, 271, 5170, 3033, 273, 18747, 6421, 18, 50276, 19, 253, 1072, 906, 1057, 417, 2186, 604, 359, 1007, 387, 253, 11193, 275, 260, 76, 14168, 4482, 391, 6421, 465, 1146, 247, 8566, 253, 4477, 5276, 326, 253, 5927, 4871, 310, 495, 407, 4933, 247, 2258, 442, 18398, 4636, 50275, 20, 275, 1340, 281, 6558, 253, 4871, 273, 2781, 9665, 18, 17713, 275, 260, 76, 14168, 4482, 391, 6421, 352, 31088, 281, 897, 774, 86, 285, 7887, 1396, 569, 50275, 21, 271, 5170, 3033, 327, 253, 4871, 273, 2781, 9665, 19, 17713, 18, 323, 11193, 275, 298, 27905, 14168, 4482, 391, 6421, 310, 1677, 323, 247, 4618, 966, 273, 5743, 3470, 50274, 783, 27947, 3831, 767, 2022, 3603, 273, 38135, 50275, 66, 253, 5170, 14493, 1543, 13900, 1840, 10725, 327, 752, 253, 4477, 1067, 247, 12425, 6974, 253, 3280, 310, 18301, 281, 247, 327, 264, 37613, 2127, 3418, 407, 271, 32049, 253, 2127, 3418, 310, 18301, 281, 247, 327, 264, 37613, 2303, 407, 247, 16407, 6081, 285, 253, 2303, 310, 18301, 281, 4972, 2810, 281, 253, 3453, 407, 253, 29810, 253, 2934, 310, 326, 512, 841, 1264, 8115, 32049, 16407, 6081, 285, 29810, 476, 320, 8818, 342, 11454, 6928, 273, 4871, 2781, 9665, 18, 17713, 407, 970, 5697, 273, 2720, 1543, 24088, 15761, 249, 285, 5580, 413, 4240, 253, 1127, 273, 253, 12425, 6974, 310, 281, 34430, 713, 253, 3280, 285, 253, 3453, 7877, 253, 3236, 10603, 432, 247, 2317, 273, 7877, 18747, 281, 247, 2317, 273, 7877, 17713, 310, 7154, 715, 891, 247, 10603, 432, 7877, 18747, 281, 7877, 337, 21255, 247, 10603, 432, 7877, 337, 281, 7877, 337, 285, 37685, 247, 10603, 432, 7877, 337, 281, 7877, 17713, 436, 310, 752, 4483, 281, 3157, 253, 3033, 327, 253, 4871, 432, 50276, 9665, 6421, 18, 281, 2781, 9665, 18, 17713, 50276, 67, 253, 2258, 442, 18398, 4636, 327, 534, 253, 2406, 3033, 310, 1754, 906, 374, 1840, 3249, 432, 247, 17597, 4154, 50275, 783, 2929, 310, 973, 3542, 253, 1543, 403, 4722, 285, 2266, 253, 4737, 5609, 403, 4460, 3021, 891, 717, 3839, 2762, 670, 253, 19529, 50276, 74, 452, 247, 1643, 3533, 2013, 7969, 50276, 82, 18, 436, 310, 247, 2087, 1953, 253, 4477, 2085, 247, 2087, 5170, 3033, 327, 253, 4871, 273, 2781, 9665, 19, 17713, 18, 310, 326, 6863, 323, 247, 35851, 273, 1396, 569, 667, 4385, 327, 849, 281, 3157, 352, 4457, 774, 86, 50276, 82, 19, 18057, 608, 1057, 253, 873, 14168, 1179, 256, 390, 39406, 253, 4327, 273, 247, 18, 247, 19, 270, 18, 270, 19, 3469, 327, 815, 74, 352, 4453, 751, 436, 310, 253, 1083, 285, 352, 651, 320, 1805, 281, 19148, 436, 1127, 50276, 82, 20, 2593, 8073, 352, 310, 5393, 326, 407, 253, 5426, 273, 11591, 285, 18057, 608, 627, 4961, 247, 1386, 23965, 272, 342, 14168, 1179, 270, 849, 513, 368, 12215, 253, 15171, 342, 14168, 
1179, 270, 253, 873, 14168, 1179, 256, 275, 18057, 608, 310, 12314, 285, 275, 8063, 778, 417, 1918, 6054, 281, 271, 15171, 50275, 82, 21, 275, 18057, 721, 253, 4477, 2312, 670, 247, 11542, 1854, 14063, 4445, 1293, 13947, 752, 1854, 14063, 2097, 50274, 74, 20673, 247, 4564, 273, 963, 993, 50276, 85, 18, 3239, 721, 16407, 6081, 76, 278, 943, 320, 16407, 907, 76, 278, 50276, 85, 19, 46350, 387, 387, 1878, 436, 6634, 2067, 2069, 275, 253, 2022, 2505, 285, 253, 30762, 347, 973, 187, 187, 4118, 18435, 27, 9389, 37289, 30628, 285, 581, 9648, 13224, 37317, 497, 2762, 818, 670, 436, 19529, 253, 4477, 2380, 31637, 247, 1643, 3533, 285, 5701, 432, 253, 3302, 10123, 253, 2929, 3400, 3242, 14493, 326, 2810, 253, 8037, 875, 2406, 285, 5170, 14493, 285, 326, 7729, 441, 2096, 841, 6928, 1805, 342, 253, 38350, 2762, 8680, 891, 717, 46705, 253, 2929, 281, 320, 7607, 209 ]
Below is a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: the authors propose a regularizer for training dnns, specifically adding the trace of the hessian wrt the parameters of the model. they present two efficient stochastic estimators of the trace, based on the hutchinson method and their own extension to it specific to nns. the authors also provide a motivation of the regularizer through a known generalization bound and through the perspective of dynamical systems stability. in three experiments, on cifar10, cifar100 and wikitext, they present results suggesting that the proposed method performs better on average than other techniques.

the paper starts well, with the clear and simple goal of introducing a new regularizer. it motivates it through an existing generalization bound on linear models; although this bound does not translate to dnn models, i think this motivation is clearly presented and makes sense.

the stochastic estimators of the hessian through the hutchinson method are well known and used in many places in ml. equation 11 is somewhat redundant, since anyone familiar with how modern autodiff packages perform efficient hessian-vector products would know that they are just restating well-known facts; at the very least they should be citing pearlmutters paper on fast hessian-vector products, which basically defines this. in this whole section on the hessian estimator im very unclear what the benefit of the proposed extension sethd over sethh is. theoretically, due to dropping blocks of the whole network, it should be more efficient; however, practically, given the way modern autodiff packages work, it would be a significant challenge to actually compute hessian-vector products such that they benefit from this. eg if you just zero out those parts of sigma and still perform dense products on lines 5 and 6 of algorithm 2, it wont make any difference. later in the results the authors do cite that, based on the probability p that they pick, they obtain different run times. i think the authors should provide more details on how in practice they actually implement sehtd in order to take advantage of the structure of sigma; these details, although they might look unimportant, are probably the only reason why one would even use sehtd over sethh.
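for concreteness, here is a minimal pytorch-style sketch of the standard hutchinson estimate computed with a pearlmutter-style hessian-vector product via double backprop. this is my own illustration of the well-known construction, not the authors code, and the function name is made up; it is only meant to show why zeroing entries of sigma does not by itself save any compute, since both autograd calls below stay dense over all parameters.

```python
import torch

def hutchinson_hessian_trace(loss, params, n_samples=1):
    # hutchinson estimate: E[sigma^T H sigma] = tr(H) when E[sigma sigma^T] = I.
    # grads are built with create_graph=True so they can be differentiated again
    # (and so the resulting estimate can itself be used as a penalty term).
    grads = torch.autograd.grad(loss, params, create_graph=True)
    estimate = 0.0
    for _ in range(n_samples):
        sigmas = [torch.randn_like(g) for g in grads]  # gaussian; rademacher also works
        g_dot_sigma = sum((g * s).sum() for g, s in zip(grads, sigmas))
        # gradient of g^T sigma wrt params is the hessian-vector product H sigma
        hvps = torch.autograd.grad(g_dot_sigma, params, create_graph=True)
        estimate = estimate + sum((h * s).sum() for h, s in zip(hvps, sigmas))
    return estimate / n_samples
```

zeroing some entries of sigma in this sketch only removes terms from the two inner products; the dense autograd.grad calls still touch every parameter, which is exactly the implementation question raised above.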
i have quite a bit of an issue with the whole section 3.4. the authors attempt to motivate the trace through stability analysis. first, they expand the idealized ode, for which there is now some literature that uses backward analysis to derive more accurate odes and in practice shows that the idealized version does not capture well what happens with the dynamical system. second, the main point at the end of this whole section, talking about stability of equilibrium points, is to argue that the goal of the method is to have a more unstable dynamical system. why would we want that? i dont know. the last paragraph presents an argument that since standard training is a function only of the training data, we would like it to be unstable in order to avoid overfitting; however, the authors do not present any evidence or even allude to why that last statement would be true, and im quite doubtful that it holds at all. we can make training unstable in many other ways (injecting a significant amount of noise, for instance) which do not lead to better generalization. as a result i would advise the authors to either significantly change or just remove this whole section, as it currently serves no purpose.

finally, on the results section, i have a few major issues with some of the numbers im seeing, which make me a bit doubtful of the overall results. the wrn-28-10 has a baseline with 73 top-1 accuracy; the default pytorch open-source results from https://github.com/meliketoy/wide-resnet.pytorch indicate 80 accuracy (zca, no dropout, 5e-4 weight decay). perhaps the authors are using a different type of preprocessing scheme or there are other differences, but it should be cleared up why they get so much worse results with the same model. another issue i have with the presented results is that the authors do not mention having done any search over the regularizer penalty for any of the competing methods, and for theirs they try only two values (0.01, 0.05). they must mention why this is done; if they pick the values from one of the papers, they should make that more clear and also make it clear that the problems considered are equivalent to those in the reference. for many regularizers the optimal weight changes between different problems, and this might or might not explain some of the degradation we see for some of the other techniques. probably the final experiment is best executed, with significant hyperparameter search and good ablations, which makes it the only one whose results i trust, and unsurprisingly there the benefits are significantly more marginal than in the others. however, it is unclear what validation and test perplexity mean exactly: do you train and cross-validate the hyperparameters to select them and then retrain on train + validation and measure test, or do you just measure the performance on the test set of the best cross-validated model? please clarify.

good and simple idea. i have some issues with lacking implementation details of the method that the authors use in practice; one of the sections seems completely redundant to the narrative unless the authors provide a better explanation for their final statements there; some of the experiments seem to have lower numbers than publicly known results; and in general there is a lack of exploring hyperparameters for other regularizers. the last experiment is probably the only one that is done really well.

docsep

in this paper the authors propose an efficient regularizer for the trace of the hessian. to circumvent the extra computation needed by the naive hutchinson estimator, which pretty much amounts to taking three derivatives of the loss, the authors subsample the weight matrices and then subsample a second time internal to each weight matrix. experimental results on cifar10 and cifar100 are presented; the proposed regularizer helps, but unfortunately not in a statistically significant way.

strengths: the motivation of using jacobian and hessian trace norm regularizations is drawn from existing theoretical work; its quite a nice way of tying it together with the extensive literature on generalization bounds. one comment worth adding, though, is that i believe all of those bounds are unfortunately vacuous, but even in that light i think the idea is natural and worth writing a paper about. the initial downsampling corresponding to p < 1 is a clever way to make this work; otherwise, instead of the 1.2-1.5x slowdowns, i would have expected to see at the very least a 2x slowdown for two more derivatives, and practically speaking more like a 10-50x slowdown given how slow hessian-vector products seem to be in practice.

weaknesses: unfortunately the better results have heavily overlapping confidence intervals. in the top pane the 2-std confidence interval is 0.47; combined with, say, the sehtd + label smoothing result, its clear that none of these results are significant. i would suggest that the authors aggressively drive down the sizes of these confidence intervals. in the bottom pane, the comparison between mixup and sehtd + mixup suffers from the same problem; in fact, from these experiments the only reasonable conclusion is that mixup works really well. for comparisons sake i think the authors really need to try the full version of the regularizer, with multiple iterations and prob = 1.0. its obviously going to be much slower, but if that doesnt work better than the existing sehtd then i think something is probably wrong; ie if the base claim is that hessian regularization works, it is much less convincing if you dont directly test that hypothesis and go directly to approximations. id even be interested in seeing how much variance is introduced by the trace estimator, the two rounds of downsampling, etc. i think there needs to be a careful study of the hyperparameters of the regularizer, ie prob and the regularization coefficient. the idea is interesting and the approximation is nice, but the experimental results are too weak/incomplete to make this a convincing paper.

docsep

the paper develops a new regularization method for deep neural networks: the proposed method penalizes the trace of the hessian. the paper adapts the hutchinson method for estimating the trace of a positive semidefinite matrix to the purpose of estimating the trace of the hessian; this adaptation uses a dropout mechanism to compute the trace of the hessian efficiently. the paper also studies the effects of minimizing the trace of the hessian using linear dynamical systems theory and shows that lowering the trace of the hessian diminishes the stability of the equilibrium points in the parameter (ie weight) space, thereby reducing overfitting and improving generalizability of the network. the paper concludes with a set of experiments on standard deep learning benchmarks (cifar10, cifar100 and wikitext2), and in almost every case the proposed sehtd scheme achieves the best results.
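to make the objective being evaluated concrete, the training loss under discussion is roughly the cross-entropy plus lambda times a stochastic estimate of tr(H). a rough sketch of one training step follows, assuming a differentiable helper like the hutchinson_hessian_trace function sketched earlier; the names are mine, and lam=0.01 simply mirrors one of the two penalty values the reviews mention, not a recommended setting.

```python
def training_step(model, batch, optimizer, criterion, lam=0.01, n_samples=1):
    # one optimization step on: cross-entropy + lam * (stochastic estimate of tr(H)).
    # n_samples=1 is the cheap setting; the "full version" asked for above would use
    # many samples and no downsampling of sigma, at a much higher cost.
    x, y = batch
    ce = criterion(model(x), y)
    params = [p for p in model.parameters() if p.requires_grad]
    trace_penalty = hutchinson_hessian_trace(ce, params, n_samples=n_samples)
    total = ce + lam * trace_penalty
    optimizer.zero_grad()
    total.backward()  # differentiates through the penalty, i.e. a third derivative of the loss
    optimizer.step()
    return ce.detach(), trace_penalty.detach()
```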
the paper makes a strong case for the proposed regularization scheme penalizing the trace of the hessian. the empirical results included in the paper seem to indicate that the proposed regularization scheme improves network performance and generalizability. the technical details are accessible, and it is easy to follow the mathematical steps presented in the paper that lead to the final efficient algorithm for estimating the trace of the hessian. section 3.4, which describes the linear stability analysis, cogently argues why minimizing the trace of the hessian is one way to improve the generalizability of the network.

the paper also suffers from some shortcomings. while reading the paper i stumbled upon a number of typos, misspellings, incorrect double quotes (very common among latex users), etc; these can be easily fixed. a more serious issue with the paper is its use of the word efficient, as in "to efficiently implements [sic] the hutchinson method": i did not see any experiments that showcase how efficient the proposed method is. i would have liked to see some runtime figures that drive home the point about the efficiency of the proposed approach; is there a way to remedy this? the paper actually presents two methods for estimating the trace of the hessian: the first method does not use dropout, whereas the second, more efficient method uses dropout. the paper contains no results using the first method, and it would be nice to know the performance gap between the first and the second method. in many situations performance and efficiency are inversely related; does that hold here too? alternately, maybe deep neural networks upend this trend and the second method posts both better performance and better runtimes.

overall the paper is well written. it proposes a new regularization scheme that seems to improve deep neural networks performance on the cifar10, cifar100 and wikitext2 datasets; the proposed scheme improves performance for both convolutional and recurrent neural networks. the paper uses linear dynamical systems theory to argue that minimizing the trace of the hessian improves the generalizability of the network. lastly, the mathematical description for efficiently computing the trace of the hessian is easy to follow. my only issue with the paper is that it doesnt include any results to support the assertion that the algorithm for estimating the trace of the hessian is efficient.

docsep

the authors propose a method for estimating the trace of the hessian of the cross-entropy loss with respect to the weights of a neural network classifier. they suggest adding the trace estimator as a regularizing penalty term for training neural networks with improved generalization. the authors give two theoretical motivations for regularizing the hessian trace. for linear models, the trace of the hessian of the cross-entropy loss with respect to the logits appears as a factor in a term bounding the generalization error; the authors draw a connection between the hessian with respect to the weights and the hessian with respect to the logits, and argue that penalizing the trace of the hessian with respect to the weights also leads to a tighter bound on the generalization error. for the continuous gradient dynamics around the optimum, the hessian eigenspectrum is the negative eigenspectrum of the jacobian of the gradient dynamics, which determines the stability of the dynamical system; the authors argue that decreasing the hessian trace increases the jacobian trace and hence reduces data-dependent stability, which will avoid overfitting.

the authors suggested method for estimating the trace uses the stochastic hutchinson trace estimator, which samples a random vector sigma with E[sigma sigma^T] = I to get E[sigma^T H sigma] = tr(H). the authors appear to use a backpropagation-through-backpropagation approach to compute sigma^T H sigma: first computing the gradient g = dl/domega of the loss l with respect to the weights omega, then computing the gradient of the inner product g^T sigma (which is a scalar) with respect to the weights again; this gradient-over-gradient can then be inner-product multiplied again with sigma. in order to save cost, the authors suggest a dropout method, setting a fraction of the entries of sigma to 0, which saves a fraction of the derivative computation and correspondingly drops terms of the trace sum.
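spelling that construction out as a sketch may help. the following is my own reading of the dropout idea, with invented names and my own unbiasing rescaling; i cannot tell from the text whether it matches the authors exact procedure, and, as noted in the first review, zeroing entries of sigma only becomes cheaper if the implementation actually exploits the sparsity rather than keeping the autograd calls dense.

```python
import torch

def masked_hutchinson_trace(loss, params, keep_prob=0.5):
    # same double-backprop estimator, but with "dropout" applied to sigma:
    # each entry of sigma is kept with probability keep_prob, otherwise zeroed.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    sigmas = []
    for g in grads:
        mask = (torch.rand_like(g) < keep_prob).float()
        sigmas.append(torch.randn_like(g) * mask)
    g_dot_sigma = sum((g * s).sum() for g, s in zip(grads, sigmas))
    hvps = torch.autograd.grad(g_dot_sigma, params, create_graph=True)
    # E[sigma_i^2] = keep_prob, so divide by keep_prob to keep the estimate unbiased
    return sum((h * s).sum() for h, s in zip(hvps, sigmas)) / keep_prob
```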
the authors then present experimental results evaluating their suggested regularization method, in combination with and compared to other regularization methods, on cifar10 (resnet18), cifar100 (wide residual network) and wikitext2 (2-layer lstm).

strengths: the connection of the hessian spectrum and trace to generalization is an interesting and promising area of research, and it is great to see results for a method regularizing the hessian that seems to outperform other regularization methods. the structure of the paper is clear and the paper is understandable.

weaknesses: the connection between theoretical justification and practical method is not always clear, and the relevant details could be expanded. my feeling is that the connection between using the hessian / jacobian with respect to the logits vs with respect to the parameters is not as straightforward as it is made out to be in the paper and could be explained better. for example, the paper states "naturally at the minimum the gradient is zero and the norm of the jacobian matrix is small near minimum"; but if we have, eg, a nonlinear classifier f(x) = w y(x), where w are the weights of the last layer, then dl/df is what is meant as the jacobian in the bound, while the gradient in optimization is with respect to the parameters, eg dl(f(x))/dw = dl/df * df/dw = dl/df * y(x). could there perhaps be a way for the gradient to be small because y(x) is small, without dl/df being small? this question could be expanded upon. similarly, an explanation of the difference between the trace of the loss hessian wrt the logits vs wrt the weights would benefit the paper; for example, it is well known that the hessian wrt the weights for a deep neural network can have negative eigenvalues, but the hessian wrt the logits for a convex loss has only positive eigenvalues. does this matter for the trace and the associated bound somehow?

regarding the linear stability argument, the paper could be improved by empirically demonstrating that regularizing the hessian trace leads to an optimum with a hessian that has less stability, perhaps by analyzing the eigenvalues at the optimum. furthermore, is there a connection between the idea that increasing instability helps generalization and the literature that says that flat optima help generalization?

there is no discussion of using the reverse-over-reverse autodiff method for computing the hessian-vector product instead of a forward-over-reverse autodiff method, which can save memory and hence could also be faster on a gpu. in principle you can get the directional derivative of the scalar g(omega)^T sigma with respect to omega in the direction of sigma by evaluating (dg/domega) sigma ≈ (g(omega + eps * sigma) - g(omega)) / eps, or similarly d(g^T sigma)/domega * sigma ≈ (g(omega + eps * sigma)^T sigma - g(omega)^T sigma) / eps, at roughly a factor of two of the cost of evaluating only the gradient at omega. alternatively, forward-mode autodiff can be used to the same effect, and the dropout idea can still be applied by setting some of the entries of sigma to zero, eliminating further cost due to dropped nodes in the computational graph.
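as a concrete illustration of that finite-difference suggestion: the sketch below is my own, not something from the paper; it assumes params is a list of leaf parameter tensors and loss_fn is a closure that recomputes the loss at the current parameter values, and the step size eps is illustrative.

```python
import torch

def fd_sigma_h_sigma(loss_fn, params, sigmas, eps=1e-3):
    # sigma^T H sigma ~ sigma^T (g(omega + eps*sigma) - g(omega)) / eps,
    # using two gradient evaluations instead of a second backward pass.
    # note: the result is a plain number, not a node in the autograd graph, so using
    # it inside a penalty that itself needs gradients would require extra care.
    g0 = torch.autograd.grad(loss_fn(), params)
    with torch.no_grad():
        for p, s in zip(params, sigmas):
            p.add_(eps * s)          # perturb the parameters in the direction of sigma
    g1 = torch.autograd.grad(loss_fn(), params)
    with torch.no_grad():
        for p, s in zip(params, sigmas):
            p.sub_(eps * s)          # restore the original parameters
    return sum(((a - b) * s).sum() for a, b, s in zip(g1, g0, sigmas)) / eps
```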
the experimental section does not analyse the regularizer itself or the connection to the theory of hessian trace regularization, and is instead focused on just demonstrating beating baselines. there is currently no analysis of the tradeoff between the bias of the trace estimator from using more dropout and the additional compute cost, just the statement "although increasing the probability to select parameters can improve test accuracy the time consumption will increase a lot". the paper could be improved by adding an empirical analysis of the accuracy of the trace estimator. there is also no analysis of how trace accuracy impacts generalization performance, or of the trace itself; eg you could add plots showing how the true hessian trace changes throughout training. it would be good to see the training and cross-validation curves for different hyperparameters of the overlaid penalty terms (hessian trace term, l2 norm term, etc) to see how they impact training and how stable training is to the corresponding hyperparameters. it would also be good to have an analysis of how much the hessian trace term adds on top of hyperparameter-tuned other regularization; eg for jacobian regularization the paper says just "we set number of projections nproj = 1 and weight values jr = 0.01", which does not necessarily suggest that there was a sufficient hyperparameter tuning effort. another interesting question to analyze would be how the added regularizer impacts the training dynamics, eg faster or slower convergence in the beginning vs the end of training, and why.

some more general recommendations to improve the paper: checking grammar and spelling typos will improve the readability and quality of the paper; state the generalization bound as a theorem, as in the original paper (theorem 4.1); in related work, perhaps also connect to other works analyzing the hessian and fisher matrix spectrum and related norms for generalization, eg "fisher-rao metric, geometry, and complexity of neural networks".

i do not recommend to accept the paper in its current form, since the connection between the theoretical justification and the empirical evidence is weak and the experiments in the paper are not sufficient for an empirical understanding of how the suggested regularizer really works.

### Summary:
this paper regularizes deep neural networks via the hessian trace. the algorithm is based on hutchinsons method, further accelerated via dropout, and a connection to the linear stability of the dynamical system is discussed. the proposed regularization compares favorably in the experimental results. the idea of the method is clear, but the papers writing needs a lot of improvement because there are a number of grammatical errors. the major technical concerns include (a) the experimental results are still not convincing, and (b) the explanation that favors instability in the dynamical system as a means of preventing overfitting. reviewer gdik: ive read the rebuttal but remain unconvinced.
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

This paper proposes a method to learn the cloth deformation of a t-shirt given the skeletal pose of an upper body. The method skins a thick tetrahedral mesh to the skeleton and embeds the t-shirt's cloth within it. At inference time, a network predicts a rest-pose displacement before conducting skinning via barycentric lookup in the tet mesh. The method is trained, as far as I can tell, on some ground-truth cloth simulation method; this is not revealed.

I recommend rejecting this paper from ICLR 2021 on several grounds: 1) the results are poor, 2) the description is hard to follow, 3) the methodological choices are not well motivated, 4) the method as written is not reproducible, and 5) the claims are too general.

1. In terms of topic and methodology, this paper would be an appropriate submission to SIGGRAPH or SCA. Calibrating for the expected result quality at either venue, I would recommend acceptance, since the machine learning component of this paper is not a contribution besides being an application of off-the-shelf tools; I do not see reason to lower these standards for ICLR.

2. After reading the paper, I eventually feel I understand this method, but important details are left out, affecting replicability (see below). The paper does not clearly state what the input and output are. The paper does not describe how ground-truth data is generated. "This is done by first sorting the tetrahedra on the list based on their largest minimum barycentric weight, i.e., preferring tetrahedra the vertex is deeper inside": I do not understand this. Barycentric weights are largest when near a vertex; meanwhile, tetrahedra can be very pancake- or sliver-shaped, so that regardless of the barycentric coordinates a point is never deep inside. Using barycentric coordinates to measure depth of penetration is misguided (a small numerical illustration is sketched below). I also did not understand "method 2": in that method, is the tet mesh entirely ignored? Is figure 4 showing training data or withheld testing poses?
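To make the barycentric-coordinate objection concrete, here is a small self-contained illustration; the tetrahedron and query point are invented for this sketch and are not from the paper.

```python
import numpy as np

def barycentric_coords(p, v0, v1, v2, v3):
    """Barycentric coordinates of point p with respect to tetrahedron (v0, v1, v2, v3)."""
    T = np.column_stack([v1 - v0, v2 - v0, v3 - v0])
    lam123 = np.linalg.solve(T, p - v0)
    return np.concatenate([[1.0 - lam123.sum()], lam123])

# A sliver tetrahedron: the fourth vertex sits only 0.01 above the base triangle.
v0 = np.array([0.0, 0.0, 0.0])
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([0.25, 0.25, 0.01])

centroid = (v0 + v1 + v2 + v3) / 4.0
print(barycentric_coords(centroid, v0, v1, v2, v3))  # [0.25 0.25 0.25 0.25]
print(centroid[2])                                    # 0.0025: essentially on the base plane
```

All four weights equal 0.25, which is the largest the minimum barycentric weight can ever be, yet the point lies only 0.0025 units from the base face; the sorting criterion quoted above therefore says little about how geometrically deep a vertex is embedded.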
3. The paper immediately jumps into the idea of embedding the cloth of a t-shirt in a bulbous tetrahedral mesh around the upper body. This is not questioned until later, when all sorts of issues appear due to overlapping and inverted elements. There was no reason to think that skinning such a thick tet mesh was a good idea in the first place, so the inversion-and-robustness section is describing ad hoc heuristics for a problem that could have been avoided by starting with a more sound premise. The working premise is that pose-space deformations can be used for efficient cloth simulation; this is reasonable and traces its heritage to a Powell-optimization approach for example-based skinning in a production animation environment, which should probably be cited. From there, the choice of using a thick tet mesh comes without solid motivation. Why not, for example, instead learn the skinning weights and displacements directly, so that for a point p on the cloth the final deformation is f(p) = sum_i w_i(p) T_i p + d(p) (a rough sketch of this alternative is given below)? To generalize across body types etc., rather than the heavy-handed proposed approach of sharing this mysteriously skinned tet mesh, the learned w and d functions could be predicted based on some relative position to the rest pose, the t-shirt size, etc.
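A minimal sketch of the alternative the reviewer suggests, assuming the usual linear-blend-skinning reading of the formula above (skinning weights w_i(p), bone transforms T_i, and a learned per-point correction d(p)). The function and variable names, array shapes, and numbers are invented for illustration; this is not the paper's method, and it may differ in detail from what the reviewer had in mind.

```python
import numpy as np

def skin_cloth(points, weights, bone_transforms, displacement):
    """f(p) = sum_i w_i(p) * T_i * p + d(p): blend-skinned cloth points plus a learned correction.

    points:          (V, 3) rest-pose cloth vertex positions p
    weights:         (V, B) skinning weights w_i(p), each row summing to 1
    bone_transforms: (B, 4, 4) homogeneous bone transforms T_i for the current pose
    displacement:    (V, 3) pose-dependent correction d(p), e.g. the output of a small network
    """
    V = points.shape[0]
    homogeneous = np.concatenate([points, np.ones((V, 1))], axis=1)   # (V, 4)
    blended = np.einsum('vb,bij->vij', weights, bone_transforms)      # per-vertex blended transform
    skinned = np.einsum('vij,vj->vi', blended, homogeneous)[:, :3]    # sum_i w_i(p) T_i p
    return skinned + displacement

# Made-up example: three cloth points, two bones, zero correction.
pts = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0], [1.0, 0.0, 0.0]])
w = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]])
T0 = np.eye(4)
T1 = np.eye(4)
T1[:3, 3] = [0.0, 0.2, 0.0]            # second bone translated upward
d = np.zeros_like(pts)                 # stand-in for a network prediction
print(skin_cloth(pts, w, np.stack([T0, T1]), d))
```

In this reading, generalization across body types would come from predicting the weights and the correction from features of the rest pose rather than from a shared, hand-built tet mesh.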
4. This paper is far from replicable. How is the ground-truth data computed: some cloth simulation? Which method? Are collisions handled in that method? Why does the surface boundary in figure 1c look so spiky, yet the input level set is smooth? This does not appear in the results of the red-green tetrahedralization; those results look more like a simply clipped regular-grid tetrahedralization. How are the weights w_kj determined: manually, automatically, optimized during training? This appears to be crucial to the method but is left out.

5. Finally, this paper claims to provide a skinning (a parameterization) of three-dimensional space for neural network cloth. Even if I accepted this paper as successful in its results (I do not), it could at best claim to have skinned a parameterization of t-shirt deformations for upper-body motions. The fragility of the method as discussed above makes this overclaiming especially dubious.

docsep

I am not very familiar with 3D cloth animation, so I may not provide an entirely adequate evaluation. However, it looks to me that this paper is more suited to a graphics conference like SIGGRAPH. The paper presents a new method for cloth deformation based on a tetrahedral KDSM mesh. To make the deformation more natural, a neural network is used to predict the offsets that match the ground-truth deformation. Differently from Jin et al. (2020), in this paper the volumetric region of air surrounding the underlying body is used.

There are several questions about the experimental part of the paper. 1) The paper only compares on a single dataset of t-shirts, so it is not clear how well the model will perform on other types of clothes (regular shirts, dresses, and so on). The paper, however, claims that the method could potentially be extended to broader categories such as hair and fluids; is it possible to add more comparisons on other clothes or object types? 2) How has the dataset of t-shirt meshes been obtained: is it synthetically generated? If it is synthetically generated, what is the benefit of using a data-driven learning method? Could the method be applied to real-world scanned meshes?

Minor issues: Figure 1d looks unintuitive; d does not look like a modification of c, and d does not look like a shirt at all. It would be better to use some small modification to make it more intuitive. Supplementary material should go along with the paper as a set of appendixes; only videos and source code should go as a zip archive. The paper contains weird green artifacts at the end of the first and second pages, which should be removed. It is hard to switch attention from figure 3 to figure 4; these figures should be joined.

The rebuttal changed neither my confidence nor my impression about the paper, so I did not change my rating; however, I acknowledge that other reviewers may be more proficient to judge this paper properly.

docsep

The authors derive a volumetric extension of the surface parameterization approach developed by Jin et al. Towards this, they propose to use a tetrahedral parameterization built from well-known techniques in the computer graphics community; the kinematically deforming skinned mesh (KDSM) formulation for tetrahedral parameterization is borrowed from Lee et al. The combination of the above techniques, coupled with some heuristics to increase robustness to inversion, suggests improvement over Jin et al.

This is a very niche topic, and I am not confident that the general audience stands to benefit from this specific formulation for clothes. The ICLR community would benefit from a demonstration of the approach on other deformations of solids/liquids and a validation of the generality of the approach compared to other representations beyond virtual cloth; only comparing to Jin et al. significantly limits the scope of the paper. The computational complexity of the approach is completely ignored; as the gains over Jin et al. seem to stem from extending the formulation to the 3D domain, the compute should be compared. This becomes important for high-dimensional solid/liquid simulation.
docsep

This paper proposes to model 3D cloth by embedding it into a kinematically deforming skinned mesh (KDSM) [1], a tetrahedral mesh that parametrizes the volumetric region around the underlying body. A KDSM can be created and deformed using a variety of skinning and simulation techniques introduced in [1]. This paper extends the KDSM by enabling plastic deformation in material space (t-pose) and accurately models the cloth deformation as per-vertex offsets. Inspired by [2], this paper trains a neural network to learn the per-vertex offset as a function of body pose (a toy sketch of this kind of setup is given below). Once trained, the network is able to infer the 3D cloth on a particular body. Experiments show that the proposed 3D cloth parameterization method is better than the 2D UV parameterization method used in [2].

Strengths: This paper proposes a new approach to modeling 3D cloth deformation. Experiments demonstrate that it outperforms the UV parameterization method on modeling the per-vertex offset as a function of body pose, and it has the potential to be applied to other cloth-related tasks. This paper successfully adapts the KDSM, which was originally used for hair modeling, to clothes; the inversion and robustness issues are addressed. It would be inspiring for other researchers to apply a similar idea to other types of objects. The paper is clearly organized and well written, and there are sufficient technical details presented in the paper.

Weaknesses: Some existing works parameterize cloth deformation based on SMPL, e.g., SMPL+D [3], TailorNet [4], Tex2Shape [5]. This paper lacks a comparison with these closely related methods, so it is not clear whether the proposed KDSM-based cloth parameterization is better than existing SMPL-based methods. There is only experimentation on synthetic data; it is not clear how the method performs on real-world data like BUFF [6]. At the end of Sec. 6 the authors claim that the hybrid method is able to achieve greater temporal consistency, but it is not clear which component of the method enforces temporal consistency.

Suggestion: In Sec. 5 the authors elaborate on how to robustly handle tetrahedra inversion and overlapping when generating training examples. Described in natural language, the entire process is too complicated for readers to follow. For example, the sentence "we prune this list of tetrahedra, keeping only the most robust tetrahedron near each element boundary" makes readers wonder how the robust tetrahedron and the element boundary are defined. It would be better if the authors could provide a piece of pseudocode to explain the entire process in a compact and precise manner; a graphic illustration of the key operations would also be helpful for readers to understand the process.
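The review above summarizes the paper as training a network that maps body pose to per-vertex cloth offsets. The following is a minimal stand-in sketch of that kind of setup; the architecture, layer sizes, pose encoding, and the space in which the offsets are applied are all invented here for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_joints, n_verts, hidden = 20, 3000, 256
W1 = rng.normal(scale=0.01, size=(hidden, n_joints * 3))
W2 = rng.normal(scale=0.01, size=(n_verts * 3, hidden))

def predict_offsets(pose):
    """Map a body pose (n_joints, 3) to per-vertex offsets (n_verts, 3).

    A stand-in two-layer MLP with random weights; the real model's inputs,
    width, and depth are unknown here.
    """
    h = np.maximum(W1 @ pose.reshape(-1), 0.0)         # ReLU hidden layer
    return (W2 @ h).reshape(n_verts, 3)

pose = rng.normal(size=(n_joints, 3))
base_cloth = rng.normal(size=(n_verts, 3))              # stand-in for skinned template positions
deformed_cloth = base_cloth + predict_offsets(pose)     # offsets assumed additive for this sketch
print(deformed_cloth.shape)
```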
[1] Lee, Minjae, et al. "A skinned tetrahedral mesh for hair animation and hair-water interaction." IEEE Transactions on Visualization and Computer Graphics 25(3), 2018: 1449-1459.
[2] Jin, Ning, et al. "A pixel-based framework for data-driven clothing." Proceedings of the 19th ACM SIGGRAPH / Eurographics Symposium on Computer Animation, volume 39, Association for Computing Machinery, 2020.
[3] Bhatnagar, Bharat Lal, et al. "Multi-Garment Net: learning to dress 3D people from images." Proceedings of the IEEE International Conference on Computer Vision, 2019.
[4] Patel, Chaitanya, Zhouyingcheng Liao, and Gerard Pons-Moll. "TailorNet: predicting clothing in 3D as a function of human pose, shape and garment style." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020.
[5] Alldieck, Thiemo, et al. "Tex2Shape: detailed full human body geometry from a single image." Proceedings of the IEEE International Conference on Computer Vision, 2019.
[6] Zhang, Chao, et al. "Detailed, accurate human shape estimation from clothed 3D scan sequences." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2017.

### Summary:
Three of the four reviewers recommend rejection; one additional reviewer considers the paper to be marginally above the threshold for acceptance but is very uncertain, and this is taken into account. The AC is in consensus with the first three reviewers that this paper is not ready yet for publication. There is concern from the reviewers that ICLR is not the right venue for this submission, and the author response in the submission update does not clarify this concern: training a neural network to solve the problem does not automatically mean that ICLR or other ML conferences are necessarily the right venue. Regardless, due to the many other raised concerns (e.g., limited experimental results and comparisons, as well as clarity), the AC recommends rejection for this paper and resubmission at a more appropriate venue.
[ input_ids: long array of integer token ids (tokenization of the example above), omitted for readability ]
[ attention_mask: array of 1s, omitted for readability ]
[ labels: array of integer token ids beginning identically to input_ids, omitted for readability ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

This paper presents a new architecture called the Block-Recurrent Transformer. Input sequences are chunked into blocks, and each block is operated on by a transformer layer; each block is connected with a recurrent layer. A block-id encoding, similar to position encoding, is introduced. The authors include ablation studies on different variants of the gate mechanism in the recurrent layer. The paper is well written and the motivation is clear. The Block-Recurrent Transformer improves the efficiency-accuracy trade-off compared to Transformer-XL, which is a strong language-model baseline; here the efficiency is the number of parameters and the runtime, and the accuracy is language-modeling perplexity. It would be much better to show that the Block-Recurrent Transformer could also be effective in the many applications where transformers are successful; however, the new architecture is only tested on language modeling, and the paper does not explain how we can use the Block-Recurrent Transformer for a wide range of applications.

docsep

This paper is interesting. The authors propose a new RNN structure for the transformer by introducing new transition states between transformers, and achieved new SOTAs. Strengths: 1) new SOTA results by cascading many transformers with trivial modifications; 2) tackled the bottleneck of the transformer by using an RNN structure for language modelling. Weaknesses: 1) this network may be more effective only for very long sequences; 2) some failure cases could be given in the paper.

docsep

The paper proposes a new model which adds block-wise outer recurrence to a transformer. The block-wise recurrence operates on a block of tokens, i.e., the hidden state is updated not after every single token but after a block of tokens. The hidden state is not just a single feature vector but a matrix, or a vector of feature vectors. The transformer accesses the hidden state by cross-attention on the hidden state, i.e., on those vectors of features. The transformer additionally uses sliding-window attention like the Longformer. At the beginning of the sequence it has access to the previous sequence and also the previous hidden state via a cache, like Transformer-XL and Longformer; the gradient flow obviously stops at the sequence boundary. In training, one sequence of tokens consisting of multiple blocks is given to the model; going over the blocks and updating the recurrent state is orthogonal to going over the tokens. For any token, the hidden state it would access is the last hidden state it has access to.

The recurrence update is similar to the layer-wise structure of a transformer, considering one transformer layer including the cross-attention to the hidden state. Mostly the same structure is used for updating the hidden state: specifically, it uses self-attention on the hidden state itself and cross-attention to the token-wise features of the current block. However, instead of residual connections they use a gating mechanism here; thus, because the self-attention and cross-attention are already combined together and then followed by an MLP, there are two gates per recurrent state update (a rough sketch of this update is given below). They also have some variations that remove some of the transformer components, like the MLP, such that the recurrent state update only has one gate; in their experiments this simpler variant actually performs better.
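The gated state update described above can be sketched roughly as follows. This is an editorial illustration of the structure the reviewer describes (self-attention over the state plus cross-attention to the current block, combined through a gate rather than a residual connection); the projections, number of heads, gate parameterization, and normalization used in the paper are not reproduced here, and the simple gate below is only a stand-in.

```python
import numpy as np

def attend(queries, keys, values):
    """Plain single-head dot-product attention, without learned projections (illustration only)."""
    scores = queries @ keys.T / np.sqrt(keys.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ values

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def update_recurrent_state(state, block_tokens, gate_bias=-1.0):
    """One block-recurrent state update, loosely following the review's description.

    state:        (S, d) recurrent state vectors
    block_tokens: (T, d) token features of the current block
    The candidate mixes self-attention over the state with cross-attention to the
    block, and a single gate interpolates between the old state and the candidate.
    """
    candidate = attend(state, state, state) + attend(state, block_tokens, block_tokens)
    gate = sigmoid(candidate + gate_bias)      # stand-in gate; the paper's gate is learned
    return gate * candidate + (1.0 - gate) * state

rng = np.random.default_rng(0)
state = rng.normal(size=(8, 16))     # 8 state vectors of width 16 (sizes made up)
tokens = rng.normal(size=(32, 16))   # one block of 32 tokens
print(update_recurrent_state(state, tokens).shape)
```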
This describes only a single block-recurrent transformer layer. Their overall model is actually mostly like the Longformer, i.e., Transformer-XL with sliding-window attention, and they simply replace one single layer by such a block-recurrent transformer layer. They also test another variant, referred to as "feedback" in the table, where they also modify the other standard Longformer layers by additional cross-attention to the last recurrent state; the recurrent state still only comes from a single layer.

The sliding-window attention is implemented in a block-wise fashion for better efficiency in training. I think it actually uses the same blocks both for the recurrent state and for the sliding-window attention, although this is a model aspect for the recurrence and an implementation detail for the sliding-window attention (a toy sketch of such a block-wise attention mask is given at the end of this review).

They perform experiments on the PG-19 dataset, an arXiv dataset, and a GitHub dataset, and they reach or surpass state-of-the-art performance on all of them.

Strengths: The idea to combine recurrence with self-attention in some way is not novel and has often been tried before in various ways, but often only with little success; from the results, it looks like the proposed model really improves considerably due to the recurrence. The results look good overall. The authors do a good job of summarizing related work. They plan to release the code.

Weaknesses (in the next part, under questions and suggestions, I will list more things individually; this is just a summary of the weaknesses): Unfortunately, the model definition is unclear in many parts. My summary is what I assumed after rereading many parts again and again and inferring what would have made sense, but this is not good; the model definition should be completely unambiguous and very clear. The analysis of aspects of the model is a bit short and leaves many open questions. The experimental comparison to the Memorizing Transformer looks a bit unfair; it is not clear whether the Memorizing Transformer can yield better overall perplexity. Some more standard language-modeling benchmarks like enwik8 or WikiText-103 are not used; this would have been nice, as there are more results from the literature to compare to. The code is not released yet, and it is also not clear whether it really will be (I have read statements like "we plan to" so often where it was never released in the end); also in this case, as there were so many things unclear, the code could have helped a lot in clarifying everything exactly, up to the latest detail. It is mentioned that the full potential is likely not achieved yet and that the current recurrent structure is maybe suboptimal. Negative social impacts are addressed shortly but adequately.
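For the block-wise sliding-window attention mentioned above, a toy mask construction might look as follows; the block size, the window of one previous block, and the causal masking are assumptions made for this sketch rather than details confirmed by the paper.

```python
import numpy as np

def blockwise_sliding_window_mask(n_tokens, block_size):
    """Boolean mask where entry (i, j) is True if token i may attend to token j.

    Each token attends causally to tokens in its own block and in the block
    immediately before it: one way to realise a sliding window in block chunks.
    """
    idx = np.arange(n_tokens)
    blocks = idx // block_size
    causal = idx[None, :] <= idx[:, None]              # j is not in the future of i
    near = (blocks[:, None] - blocks[None, :]) <= 1    # j is in i's block or the one before
    return causal & near

mask = blockwise_sliding_window_mask(n_tokens=8, block_size=2)
print(mask.astype(int))
```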
### Summary:
This paper describes a modification to the transformer architecture that uses block recurrence to more accurately model very long sequences, borrowing some ideas from the LSTM. The idea is fairly simple to implement, as it doesn't require much code over a traditional transformer, and the results seem good, if not completely overwhelmingly so. All reviewers voted to accept this paper, and I agree: it is a fairly simple idea with fairly good results, and it adds to the body of knowledge regarding how to model very long sequences.
[ input_ids: long array of integer token ids (tokenization of the example above), omitted for readability ]
[ attention_mask: array of 1s, omitted for readability ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 10262, 247, 747, 10336, 1925, 253, 2972, 250, 6259, 39707, 3280, 6430, 403, 20540, 264, 347, 247, 2972, 285, 1016, 2972, 310, 11658, 407, 247, 39707, 3828, 1016, 2972, 310, 4802, 342, 247, 18902, 3828, 2972, 2654, 9706, 2074, 281, 1899, 9706, 310, 5611, 253, 4477, 2486, 28913, 2175, 327, 1027, 11640, 273, 253, 7394, 5122, 275, 253, 18902, 3828, 253, 2929, 310, 973, 15720, 285, 253, 16038, 310, 2590, 253, 2972, 250, 6259, 39707, 19132, 253, 6733, 18921, 1974, 5454, 2727, 2429, 281, 253, 39707, 30291, 534, 310, 247, 2266, 3448, 1566, 8245, 1060, 253, 6733, 310, 253, 1180, 273, 3602, 285, 20243, 285, 253, 7200, 310, 3448, 14053, 44229, 414, 352, 651, 320, 1199, 1805, 281, 921, 326, 253, 2972, 250, 6259, 39707, 812, 671, 320, 3576, 275, 1142, 4893, 835, 4979, 398, 403, 5547, 2299, 253, 747, 10336, 310, 760, 5762, 327, 3448, 14053, 253, 2929, 1057, 417, 5513, 849, 359, 476, 897, 253, 2972, 250, 6259, 39707, 323, 247, 4618, 2491, 273, 4893, 5474, 33032, 2520, 2929, 310, 4722, 253, 4477, 12661, 247, 747, 391, 9866, 2605, 323, 39707, 407, 16984, 747, 5502, 3054, 875, 47415, 14724, 285, 49652, 1567, 747, 256, 302, 284, 50276, 296, 3755, 20556, 337, 747, 256, 5503, 1543, 407, 18779, 6748, 1142, 47415, 14724, 342, 35820, 267, 14586, 374, 11463, 1070, 253, 3673, 44856, 253, 39707, 407, 970, 391, 9866, 2605, 323, 3448, 26278, 32213, 337, 436, 2990, 778, 320, 625, 3576, 323, 1077, 1048, 6430, 50276, 18, 690, 4433, 2219, 476, 320, 1677, 275, 253, 2929, 50276, 7152, 339, 431, 248, 2929, 29328, 247, 747, 1566, 534, 11323, 2972, 3020, 8346, 11896, 7549, 281, 247, 39707, 253, 2972, 3020, 11896, 7549, 17209, 327, 247, 2972, 273, 21761, 26332, 253, 8763, 1375, 310, 9300, 417, 846, 1046, 2014, 10669, 533, 846, 247, 2972, 273, 21761, 253, 8763, 1375, 310, 417, 816, 247, 4735, 4972, 533, 247, 4315, 390, 247, 4972, 273, 3386, 253, 39707, 2289, 265, 253, 8763, 1375, 407, 2831, 4116, 327, 253, 8763, 1375, 26332, 1110, 11390, 273, 3386, 253, 39707, 23000, 4648, 20661, 13686, 4116, 751, 253, 1048, 19946, 387, 253, 5068, 273, 253, 3425, 352, 556, 2289, 281, 253, 2045, 3425, 285, 671, 253, 2045, 8763, 1375, 3066, 247, 11556, 751, 253, 39707, 30291, 285, 1048, 19946, 253, 11786, 2685, 9090, 14545, 387, 253, 3425, 7548, 275, 3733, 581, 3425, 273, 21761, 11253, 273, 2709, 8336, 310, 1677, 281, 253, 1566, 1469, 689, 253, 8336, 285, 22753, 253, 18902, 1375, 310, 19627, 281, 1469, 689, 253, 21761, 323, 667, 10669, 253, 8763, 1375, 352, 651, 2289, 310, 253, 1390, 8763, 1375, 352, 556, 2289, 281, 253, 11896, 7549, 5731, 310, 2074, 281, 253, 3828, 3020, 2605, 273, 247, 39707, 7296, 581, 39707, 3828, 1690, 253, 2831, 4116, 281, 253, 8763, 1375, 6571, 253, 1072, 2605, 310, 908, 323, 22753, 253, 8763, 1375, 5742, 352, 4648, 1881, 42959, 327, 253, 8763, 1375, 3139, 285, 2831, 4116, 281, 253, 10669, 3020, 3386, 273, 253, 1655, 2972, 2299, 3185, 273, 12541, 10291, 597, 897, 247, 305, 839, 5122, 1060, 3021, 984, 627, 310, 253, 1881, 42959, 285, 2831, 42959, 2168, 2366, 285, 840, 247, 1563, 13361, 81, 627, 403, 767, 18488, 591, 18902, 1375, 5731, 597, 671, 452, 690, 10575, 327, 11922, 690, 273, 253, 39707, 4295, 751, 253, 13361, 81, 824, 326, 253, 18902, 1375, 5731, 760, 556, 581, 7394, 275, 616, 4679, 436, 19554, 12955, 2686, 17923, 1805, 436, 2529, 2686, 760, 581, 2014, 2972, 250, 6259, 39707, 3828, 616, 4583, 1566, 310, 2686, 6571, 751, 1048, 19946, 26332, 39707, 30291, 342, 
20661, 13686, 4116, 285, 597, 3365, 8171, 581, 2014, 3828, 407, 824, 247, 2972, 250, 6259, 39707, 3828, 597, 671, 1071, 1529, 12955, 6289, 281, 347, 8680, 275, 253, 2829, 835, 597, 671, 10007, 253, 643, 2629, 1048, 19946, 8090, 407, 3081, 2831, 4116, 281, 253, 1390, 18902, 1375, 835, 253, 18902, 1375, 1335, 760, 3249, 432, 247, 2014, 3828, 50276, 783, 20661, 13686, 4116, 310, 9009, 671, 275, 247, 2972, 3020, 8142, 323, 1805, 6733, 275, 3733, 891, 1158, 352, 2686, 4648, 253, 1072, 8336, 1097, 323, 253, 18902, 1375, 285, 253, 20661, 13686, 4116, 3738, 436, 310, 247, 1566, 4809, 323, 253, 11896, 7549, 285, 271, 7092, 2508, 323, 253, 20661, 13686, 4116, 50276, 9328, 1347, 4679, 327, 253, 23256, 746, 10895, 549, 32693, 10895, 285, 40477, 10895, 597, 3986, 390, 28842, 1375, 23037, 14387, 3045, 275, 512, 20544, 50276, 783, 2934, 281, 13398, 11896, 7549, 342, 1881, 42959, 275, 690, 1039, 310, 417, 4460, 285, 556, 2223, 644, 3597, 1078, 275, 2710, 4088, 533, 2223, 760, 342, 1652, 2323, 432, 253, 1543, 352, 4453, 751, 253, 4081, 1566, 1663, 19132, 15455, 1955, 281, 253, 11896, 7549, 253, 1543, 1007, 1175, 50276, 1189, 455, 253, 4477, 513, 247, 1175, 2628, 275, 10405, 3006, 2905, 789, 50276, 11139, 281, 3727, 253, 2127, 50276, 20881, 1255, 265, 50276, 249, 253, 1735, 629, 762, 3533, 13991, 891, 588, 1618, 625, 1841, 15978, 1060, 436, 310, 816, 247, 6010, 273, 253, 32213, 50276, 328, 9520, 253, 1566, 5426, 310, 12744, 275, 1142, 4243, 619, 6010, 310, 752, 891, 8025, 846, 294, 24042, 1142, 4243, 969, 285, 969, 285, 9441, 804, 752, 651, 452, 1160, 3282, 533, 436, 310, 417, 1175, 253, 1566, 5426, 943, 320, 4336, 39662, 285, 1077, 2590, 50276, 783, 1783, 327, 7794, 273, 253, 1566, 310, 247, 2372, 2159, 285, 6505, 1142, 1527, 3533, 50276, 783, 5661, 5301, 281, 253, 16407, 3006, 39707, 4453, 247, 2372, 16593, 697, 417, 2590, 1880, 253, 16407, 3006, 39707, 476, 4917, 1805, 4583, 44229, 414, 50276, 8826, 625, 2629, 3448, 14053, 49602, 751, 546, 44874, 25, 390, 35372, 12172, 403, 417, 908, 436, 651, 452, 644, 5322, 347, 627, 403, 625, 1543, 432, 253, 6239, 281, 7277, 281, 50276, 3211, 417, 4439, 2568, 697, 671, 417, 2590, 1880, 352, 1663, 588, 320, 2218, 891, 452, 1239, 7234, 751, 359, 2098, 281, 594, 2223, 835, 352, 369, 1620, 4439, 275, 253, 990, 671, 275, 436, 1083, 347, 627, 497, 594, 1142, 1841, 12744, 253, 2127, 812, 452, 6518, 247, 2257, 275, 8254, 5411, 3253, 4555, 598, 281, 253, 6323, 2508, 50276, 262, 310, 5393, 326, 253, 2120, 2442, 310, 2779, 417, 6786, 2568, 285, 253, 1655, 18902, 2605, 310, 5046, 749, 29776, 50276, 12373, 2675, 16274, 403, 671, 13515, 533, 18212, 9713, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 8631, 247, 11237, 281, 253, 39707, 10336, 281, 897, 2972, 250, 1915, 1196, 281, 625, 13613, 1566, 1077, 1048, 6430, 40770, 690, 5697, 432, 253, 298, 296, 78, 253, 2934, 310, 9648, 2969, 281, 3359, 347, 352, 36908, 2430, 1199, 2127, 689, 247, 5899, 39707, 285, 1543, 1646, 1175, 604, 417, 4336, 42935, 594, 50276, 455, 30628, 14285, 281, 2997, 436, 2929, 285, 891, 5194, 697, 247, 9648, 2969, 2934, 342, 9648, 1175, 1543, 285, 11323, 281, 253, 2133, 273, 3640, 5001, 849, 281, 1566, 1077, 1048, 6430 ]
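the long numeric blocks above (and the similar blocks that follow each review/summary pair below) are machine-side fields rather than prose: a token-id encoding of the review-plus-summary text, a mask that is all ones, and what appears to be a second copy of the ids used as language-modeling labels. a minimal sketch of how such a row is typically assembled for causal-LM fine-tuning is shown below, assuming a huggingface-style tokenizer; the checkpoint name, the concatenation scheme, and the field handling are illustrative assumptions, not this dataset's actual preprocessing code.

```python
from transformers import AutoTokenizer

# assumed checkpoint; the dump itself does not say which tokenizer produced these ids
tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

def build_row(review_prompt: str, summary: str, max_len: int = 2048):
    # concatenate the review prompt with its reference summary, as the id sequences here appear to do
    enc = tok(review_prompt + " " + summary, truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when nothing is padded
        "labels": list(enc["input_ids"]),         # causal-LM targets simply repeat the ids
    }
```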
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

this paper studies the $\mathcal{S}/\mathcal{T}$ protocol in meta-learning, which means comparing a task-specific solver to a target model. the main contribution of this paper is proposing an efficient method to construct target models: the authors generate target models by adapting from the momentum network, which shows better adaptation performance than the meta-model itself. in the proposed method, the construction of target models and the training of the meta-model are combined and iteratively performed. experiment results demonstrate the effectiveness of the proposed method.

strengths
1. this paper studies an important and interesting problem in meta-learning, i.e., the $\mathcal{S}/\mathcal{T}$ protocol, which may bypass the limitations of query sets.
2. experiments cover both few-shot learning and reinforcement learning, which are two main applications of meta-learning.
3. this paper is well written and easy to understand.

weaknesses
1. imperfect target models: in line 7 of algorithm 1 we can see that the authors generate a target model according to the current momentum network $\phi_{\mathrm{moment}}$. however, the momentum network is updated during the meta-training procedure and it is weak in early epochs. the definition of a target model is a strong model expert on a specific task, but the target models generated by the proposed algorithm are not optimal.
2. insufficient interpretation: some technical details are not discussed adequately in the paper. for example, in equation 3 the second term stands for the loss of the $\mathcal{S}/\mathcal{T}$ protocol; however, the authors evaluate the task-specific solver on the query set $\mathcal{Q}_{\tau_i}$ rather than the support set $\mathcal{S}_{\tau_i}$, which is different from [1]. in line 146 the authors claim that target models tend to generate soft predictions, which is not verified empirically or theoretically. apart from this, the quality of the generated target models is not evaluated; since target models are experts on corresponding tasks, it is necessary to evaluate them.

references
[1] lu s, ye h j, gan l, et al. towards enabling meta-learning from target models. advances in neural information processing systems, 2021, 34: 8060-8071.

the authors have included the limitations and future work in the paper, and some other limitations are listed in the strengths and weaknesses part.

this paper proposes a simple algorithm to use the benefits of knowledge distillation (kd) in meta-learning. more specifically, since it is not possible to train an individual teacher for each meta-training task (which could be hundreds of thousands of tasks in the standard setup), this method obtains a momentum network using an exponential moving average of the meta-model and then adapts this momentum network to each task using the support set of that task. the adapted model is used as a teacher to guide the main few-shot adaptation process.

strengths: this work introduces a very interesting idea to improve the generalization of meta-learning algorithms from a kd point of view. it is similar to born-again neural networks, where a model becomes its own teacher; however, instead of training from scratch several times, it uses the momentum network to train the teacher, which is obtained by adaptation from this momentum network and called the target network. even though the algorithm is very simple, it works very well, and the provided results show improvement over the baseline for different backbones and in different applications, including classification, reinforcement learning, and regression. the writing is easy to follow and concepts are defined well. however, there are some concerns regarding this work.

weaknesses: the proposed simt algorithm can be considered as a regularizer which prevents the baseline meta-learning algorithm from overfitting to meta-train classes. since the results proposed in this paper are way behind state-of-the-art (sota) performance, i expect to see at least a comparison with other regularizer-like methods for few-shot learning, like "cross-domain few-shot classification via learned feature-wise transformation" (iclr'20). in addition, as the method can be considered in the line of works that use kd for meta-learning, a comparison with previous works needs to be included ("towards enabling meta-learning from target models", neurips'21). for few-shot classification results, as we cannot generally evaluate the whole possible set of meta-test tasks, the results are usually presented as the mean and standard deviation over a batch of tasks; however, this paper just reports the std over three trials, which is not accurate enough. for the rl results in figure 2, why is the std of the proposed model much higher than the baseline? as simt uses a momentum network, i expect to see less std in the proposed method. for computational efficiency, the analysis is very limited and is presented just for a specific configuration. regarding the table 4 ablation studies, what does it mean to use the momentum without distillation? do you mean that the meta-learner is updated using a momentum equation? in fig 3b the configuration is not clear; if it is the same config, why is this different from 3a? there are also some typos in the paper: in line 118 it should be $\mathcal{L}(\phi; \mathcal{T}_i)$ (the same problem appears in line 126 and equation 6), and in fig 3a it should be 3x faster. yes.

this paper proposes an algorithm based on knowledge distillation, together with a momentum network working as experts on the unseen fields, to solve meta-learning problems. the proposed algorithm shows compatibility with multiple existing meta-models and improvement. the general idea is simple and easy to implement.

strengths
1. the experiment shows the good performance of the model on a variety of tasks.
2. the model is analysed from different perspectives (loss landscape and computational efficiency).

weaknesses
1. from my understanding, despite all the complicated presentation about all the algorithm components, the algorithm only forces the meta-model to get close to its momentum model, which works as a regulariser; from this perspective the novelty is quite limited.
2. the proposed algorithm shows improvement and compatibility on multiple maml models but lacks comparison with other sota meta-models.

the paper is presented in an unfriendly way for readers who do not have a sufficient background. in the beginning, especially in the introduction section, a lot of concepts (target model, original model, and task-specific solvers) are introduced at a coarse level, making the paper very confusing. some components of the whole algorithm are explained very independently; it is not very clear why one relies on the other.

the paper introduces a new method for few-shot learning that utilises a self-improving momentum target. the idea is to learn from a meta-model that is more quickly adapted than the standard meta-model; the momentum target is updated using an exponential moving average rather than sgd. the approach can be combined with a wide variety of few-shot learners and consistently leads to strong improvements in performance.

strengths: the approach can be combined with various few-shot learning methods and consistently leads to strong improvements in performance. the idea itself is intuitive and novel, even if it has some similarities to bootstrapped meta-learning and momentum networks. the paper provides excellent explanations and illustrations to show how the method works. the empirical evaluation is extensive and covers few-shot regression, few-shot classification (both in domain and across domains), and also reinforcement learning, with strong results across all. there is an ablation study evaluating the various components, showing that all parts are important.

weaknesses: there are quite many additional components in the approach. the approach introduces several additional hyperparameters for which we need to select values, but perhaps selecting them is not too difficult.

i have enjoyed reading the paper and the strengths significantly outweigh the weaknesses, so i recommend acceptance; the weaknesses are rather minor. the paper will likely be interesting and valuable for the ml community, especially for researchers focused on meta-learning. yes, the discussion of limitations and societal impacts is well written. ### Summary:
this submission proposes a strategy to improve meta-learning that can be applied to many different base meta-learning methods. the base meta-learning method is used to independently adapt both the online network, whose parameters are being optimized, and a momentum network constructed by taking an exponential moving average of the online network's weights; a distillation loss is then used to encourage the adapted online network (with dropout on its parameters) to match the adapted momentum network. extensive experiments demonstrate that the proposed method improves the performance of several base meta-learning methods and that each component of the method is necessary to attain optimal performance. reviewers initially praised the idea and empirical evaluation, but noted that the results obtained were far from state-of-the-art and asked for additional experiments with better-performing base meta-learning methods. the authors provided these experiments during the response period, and the reviewers now unanimously recommend acceptance. the ac agrees with the reviewers' assessment.
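the summary above pins down the moving parts of the method: an online meta-model, a momentum copy of it maintained as an exponential moving average of the online weights, per-task adaptation of both networks, and a distillation term that pulls the adapted online network toward the adapted momentum network. a minimal pytorch-style sketch of one meta-training step is given below; the function names, the ema decay, the classification losses, and the first-order treatment of the inner loop are illustrative assumptions rather than the authors' implementation (in particular, the dropout-on-parameters detail mentioned above is omitted).

```python
import copy
import torch
import torch.nn.functional as F

def ema_update(momentum_net, online_net, decay=0.995):
    # momentum network = exponential moving average of the online weights (decay is an assumed value)
    with torch.no_grad():
        for p_m, p_o in zip(momentum_net.parameters(), online_net.parameters()):
            p_m.mul_(decay).add_(p_o, alpha=1.0 - decay)

def adapt(net, support_x, support_y, inner_lr=0.01, inner_steps=1):
    # illustrative MAML-style inner loop: a few gradient steps on one task's support set
    adapted = copy.deepcopy(net)
    inner_opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
    for _ in range(inner_steps):
        inner_opt.zero_grad()
        F.cross_entropy(adapted(support_x), support_y).backward()
        inner_opt.step()
    adapted.zero_grad(set_to_none=True)
    return adapted

def simt_step(online_net, momentum_net, task, outer_opt, lambda_kd=1.0, temperature=4.0):
    sx, sy, qx, qy = task                      # support / query inputs and labels for one task
    teacher = adapt(momentum_net, sx, sy)      # target model: momentum network adapted to this task
    student = adapt(online_net, sx, sy)        # adapted online network
    logits = student(qx)
    task_loss = F.cross_entropy(logits, qy)    # ordinary meta-learning loss on the query set
    with torch.no_grad():
        teacher_prob = F.softmax(teacher(qx) / temperature, dim=-1)
    kd_loss = F.kl_div(F.log_softmax(logits / temperature, dim=-1),
                       teacher_prob, reduction="batchmean")
    (task_loss + lambda_kd * kd_loss).backward()
    # first-order shortcut: copy the adapted copy's gradients back onto the online meta-parameters
    outer_opt.zero_grad()
    for p_online, p_student in zip(online_net.parameters(), student.parameters()):
        p_online.grad = p_student.grad.detach().clone()
    outer_opt.step()
    ema_update(momentum_net, online_net)       # the target generator improves as training proceeds
    return task_loss.item(), kd_loss.item()
```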
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 2175, 14168, 68, 932, 1588, 85, 7241, 275, 5148, 613, 920, 534, 2097, 10941, 247, 8892, 29765, 47037, 281, 247, 2303, 1566, 253, 2022, 7680, 273, 436, 2929, 310, 36636, 271, 5919, 1332, 281, 3989, 2303, 3210, 253, 4477, 6635, 2303, 3210, 407, 42174, 432, 253, 10254, 2990, 534, 2722, 1805, 15644, 3045, 685, 253, 42281, 49797, 3139, 275, 253, 4081, 1332, 253, 5140, 273, 2303, 3210, 285, 253, 3733, 273, 42281, 49797, 403, 5678, 285, 10040, 3146, 2684, 3368, 1543, 7568, 253, 12510, 273, 253, 4081, 1332, 20544, 337, 436, 2929, 2175, 271, 1774, 285, 4722, 1895, 275, 5148, 613, 920, 26332, 14168, 68, 932, 1588, 85, 7241, 534, 778, 18210, 253, 7364, 273, 7316, 5239, 374, 4679, 3835, 1097, 1643, 11860, 458, 1875, 285, 35221, 4715, 534, 403, 767, 2022, 4893, 273, 5148, 613, 920, 495, 436, 2929, 310, 973, 3542, 285, 3477, 281, 2096, 50276, 20881, 1255, 265, 337, 35180, 2303, 3210, 275, 1386, 818, 273, 5933, 337, 359, 476, 923, 326, 253, 4477, 6635, 247, 2303, 1566, 2556, 281, 1655, 10254, 2990, 815, 303, 506, 1109, 25094, 2299, 253, 10254, 2990, 310, 9300, 1309, 253, 1313, 255, 26208, 5199, 285, 352, 310, 5075, 275, 2393, 44540, 253, 5426, 273, 2303, 1566, 310, 247, 2266, 1566, 6485, 327, 247, 2173, 4836, 533, 253, 4561, 2303, 3210, 407, 253, 4081, 5933, 403, 417, 8654, 374, 12497, 7914, 690, 7681, 4278, 403, 417, 5469, 18212, 275, 253, 2929, 323, 1650, 275, 5150, 495, 253, 1273, 2426, 9572, 323, 253, 2957, 273, 14168, 68, 932, 1588, 85, 7241, 2299, 253, 4477, 7472, 253, 8892, 29765, 47037, 327, 253, 7316, 873, 14168, 1179, 82, 3115, 74, 2581, 685, 253, 1329, 873, 14168, 1179, 296, 1952, 74, 534, 310, 1027, 432, 337, 275, 1386, 21640, 253, 4477, 1750, 326, 253, 13650, 273, 2303, 3210, 5257, 281, 6635, 2602, 13650, 534, 310, 417, 16058, 45190, 390, 28055, 7419, 432, 436, 253, 3290, 273, 4561, 2303, 3210, 403, 417, 6760, 1580, 2303, 3210, 403, 10071, 327, 3969, 8892, 352, 310, 3309, 281, 7472, 731, 50276, 250, 3065, 50276, 18, 26535, 256, 9094, 288, 480, 36827, 298, 1162, 355, 4404, 17690, 5148, 613, 920, 432, 2303, 3210, 75, 16424, 275, 11454, 1491, 5162, 2718, 43425, 5910, 5096, 1549, 1438, 3677, 253, 4477, 452, 2908, 253, 7364, 285, 2852, 789, 275, 253, 2929, 285, 690, 643, 7364, 403, 7117, 275, 253, 20544, 285, 32213, 629, 5474, 33032, 2520, 2929, 29328, 247, 2969, 5933, 281, 897, 253, 5373, 273, 253, 3640, 940, 21755, 465, 69, 275, 5148, 613, 920, 625, 5742, 1580, 352, 310, 417, 1896, 281, 6194, 271, 2060, 9732, 323, 1016, 1313, 255, 26208, 8892, 534, 812, 320, 8307, 273, 6763, 8892, 275, 2629, 9978, 436, 1332, 31326, 247, 10254, 2990, 970, 17619, 4886, 3388, 273, 253, 42281, 49797, 285, 840, 5223, 84, 436, 10254, 2990, 715, 1016, 4836, 970, 1329, 873, 273, 326, 4836, 253, 12956, 1566, 310, 908, 347, 9732, 281, 7102, 253, 2022, 1643, 11860, 15644, 1232, 50276, 296, 3755, 20556, 50274, 2520, 789, 23970, 247, 1077, 4722, 2934, 281, 3157, 253, 26647, 273, 253, 5148, 613, 920, 11333, 432, 465, 69, 1127, 273, 1859, 352, 310, 2074, 281, 5686, 969, 11454, 6928, 835, 247, 1566, 4916, 697, 1211, 9732, 2299, 3185, 273, 3733, 432, 20041, 323, 2067, 2069, 352, 4648, 10254, 2990, 281, 6194, 253, 9732, 2797, 407, 15644, 432, 436, 10254, 2990, 285, 1925, 2303, 2990, 1014, 2167, 253, 5933, 310, 1077, 2969, 352, 2987, 1077, 973, 285, 2530, 1543, 921, 7756, 327, 8245, 323, 1027, 896, 47473, 285, 275, 1027, 4893, 1690, 9162, 35221, 4715, 285, 9077, 50275, 783, 
4028, 310, 3477, 281, 956, 285, 12342, 403, 2931, 973, 2299, 627, 403, 690, 7350, 5001, 436, 789, 50276, 20881, 1255, 265, 50275, 856, 7334, 948, 85, 5933, 476, 320, 2783, 347, 247, 3963, 6081, 534, 16897, 253, 8245, 5148, 613, 920, 5933, 432, 689, 31893, 281, 1313, 255, 1949, 5971, 1580, 253, 1543, 4081, 275, 436, 2929, 403, 1039, 3212, 1375, 23037, 14387, 256, 5503, 3045, 891, 1902, 281, 923, 387, 1878, 253, 5301, 342, 643, 3963, 6081, 3022, 3082, 323, 1643, 11860, 4715, 751, 50276, 16599, 13517, 1643, 11860, 9162, 3066, 6311, 4735, 3020, 9261, 17857, 32888, 938, 50274, 249, 1635, 347, 253, 1332, 476, 320, 2783, 275, 253, 1386, 273, 2987, 326, 4648, 465, 69, 323, 5148, 613, 920, 5301, 342, 2045, 2987, 3198, 281, 320, 2908, 50276, 32289, 2196, 17690, 5148, 613, 920, 432, 2303, 3210, 5723, 2824, 1797, 50275, 1542, 1643, 11860, 9162, 1543, 347, 359, 2550, 3839, 7472, 253, 2644, 1896, 873, 273, 1313, 255, 383, 8892, 253, 1543, 3798, 3559, 347, 253, 1599, 285, 2629, 11254, 689, 247, 14604, 273, 8892, 2299, 436, 2929, 816, 5012, 253, 6268, 327, 1264, 7587, 534, 310, 417, 7899, 2217, 50275, 1542, 391, 77, 1543, 275, 4677, 374, 2139, 253, 6268, 273, 253, 4081, 1566, 310, 1199, 2169, 685, 253, 8245, 347, 948, 85, 4648, 10254, 2990, 891, 1902, 281, 923, 1679, 6268, 275, 253, 4081, 1332, 50275, 1542, 15180, 6733, 253, 1783, 310, 1077, 3710, 285, 310, 3559, 816, 323, 247, 2173, 6661, 50275, 1747, 13218, 2829, 577, 28913, 2175, 752, 1057, 352, 1599, 281, 897, 253, 10254, 1293, 940, 21755, 513, 368, 1599, 326, 253, 5148, 613, 1216, 310, 9300, 970, 10254, 5150, 50275, 249, 3036, 495, 67, 6661, 310, 417, 2590, 604, 1072, 3596, 2139, 310, 436, 1027, 432, 253, 495, 66, 50274, 9088, 403, 671, 690, 963, 993, 275, 253, 2929, 50275, 249, 1386, 12643, 352, 943, 320, 14168, 4065, 545, 303, 1506, 2711, 74, 50275, 18941, 1895, 342, 1386, 17574, 285, 5150, 721, 50275, 249, 3036, 495, 66, 310, 943, 320, 495, 89, 7938, 50275, 9820, 5474, 339, 4029, 5933, 1754, 327, 3640, 940, 21755, 2366, 342, 247, 10254, 2990, 2444, 347, 10071, 327, 253, 39709, 4910, 281, 8415, 5148, 613, 920, 3237, 253, 4081, 5933, 2722, 22862, 342, 2709, 5368, 42281, 351, 1241, 285, 7756, 253, 2087, 2934, 310, 2969, 285, 3477, 281, 3359, 50275, 296, 3755, 20556, 50276, 18, 253, 3368, 2722, 253, 1175, 3045, 273, 253, 1566, 327, 247, 5235, 273, 8892, 50276, 19, 253, 1566, 310, 15626, 432, 1027, 24302, 2957, 13016, 285, 15180, 6733, 50275, 20881, 1255, 50276, 18, 432, 619, 4685, 5747, 512, 253, 9542, 9759, 670, 512, 253, 5933, 4295, 253, 5933, 760, 5621, 253, 42281, 49797, 281, 755, 2810, 281, 697, 10254, 1566, 534, 2987, 347, 247, 3963, 9141, 432, 436, 8668, 253, 38135, 310, 3240, 3710, 50276, 19, 253, 4081, 5933, 2722, 253, 7756, 285, 22862, 327, 2709, 278, 16878, 3210, 533, 19756, 5301, 342, 643, 256, 5503, 42281, 351, 1241, 50275, 783, 2929, 310, 3559, 275, 271, 5369, 6902, 314, 1039, 323, 253, 10668, 665, 513, 417, 452, 271, 5919, 4114, 275, 253, 5068, 3340, 275, 253, 10199, 2593, 247, 2257, 273, 12342, 2303, 1566, 3236, 1566, 285, 8892, 29765, 1220, 735, 403, 5611, 275, 253, 2847, 2308, 2403, 253, 2929, 1077, 21643, 690, 4295, 273, 253, 2644, 5933, 403, 5544, 1077, 10939, 352, 310, 417, 1077, 2590, 2139, 581, 15771, 327, 253, 643, 50276, 7152, 339, 431, 248, 2929, 23970, 247, 747, 1332, 323, 1643, 11860, 4715, 326, 4981, 3013, 247, 1881, 303, 40037, 10254, 2303, 253, 2934, 310, 281, 3037, 432, 247, 42281, 49797, 326, 310, 625, 4541, 12956, 685, 253, 2629, 42281, 49797, 50276, 783, 10254, 2303, 310, 9300, 970, 17619, 4886, 3388, 2581, 685, 256, 
35333, 253, 2746, 476, 320, 5678, 342, 247, 4618, 5235, 273, 1643, 11860, 40390, 285, 12724, 5644, 281, 2266, 11701, 275, 3045, 20544, 50276, 783, 2746, 476, 320, 5678, 342, 2710, 1643, 11860, 4715, 3082, 285, 12724, 5644, 281, 2266, 11701, 275, 3045, 50276, 783, 2934, 3139, 310, 27350, 285, 4460, 1014, 604, 253, 2934, 556, 690, 22620, 281, 7491, 10981, 1882, 5148, 613, 920, 285, 10254, 6928, 50276, 783, 2929, 3400, 7126, 22909, 285, 33954, 281, 921, 849, 253, 1332, 2987, 50276, 783, 16774, 7103, 310, 9470, 285, 10949, 1643, 11860, 9077, 1643, 11860, 9162, 1097, 275, 5028, 285, 2439, 10625, 285, 671, 35221, 4715, 50276, 3113, 2266, 1543, 2439, 512, 50276, 9088, 310, 271, 28913, 1263, 16344, 253, 2710, 4295, 4645, 326, 512, 4243, 403, 1774, 50276, 20881, 1255, 265, 50276, 9088, 403, 3240, 1142, 3081, 4295, 275, 253, 2746, 50276, 783, 2746, 23970, 2067, 3081, 4373, 22041, 323, 534, 359, 878, 281, 3609, 2193, 533, 4931, 17221, 731, 310, 417, 1512, 2834, 50276, 422, 11346, 4361, 253, 2929, 285, 253, 20544, 3012, 32180, 798, 253, 32213, 594, 891, 5583, 14924, 253, 32213, 403, 2581, 5884, 253, 2929, 588, 2779, 320, 4722, 285, 9865, 323, 253, 13361, 3114, 3340, 323, 253, 8607, 7106, 327, 5148, 613, 920, 50276, 9820, 253, 5955, 273, 7364, 285, 38058, 16274, 310, 973, 15720, 2490, 187, 4118, 18435, 27, 2520, 19529, 29328, 247, 5700, 281, 3157, 5148, 613, 920, 326, 476, 320, 3732, 281, 1142, 1027, 2613, 5148, 613, 920, 3082, 253, 2613, 5148, 613, 920, 1332, 310, 908, 281, 10939, 5223, 1097, 253, 3909, 2990, 3692, 3602, 403, 1146, 18325, 285, 247, 10254, 2990, 8818, 407, 3192, 271, 17619, 4886, 3388, 273, 253, 3909, 6928, 13461, 285, 247, 940, 21755, 2957, 310, 908, 281, 11907, 253, 12956, 3909, 2990, 342, 5926, 483, 327, 697, 3602, 281, 3761, 253, 12956, 10254, 2990, 9470, 4679, 7568, 326, 253, 4081, 1332, 19132, 3045, 273, 2067, 2613, 5148, 613, 920, 3082, 285, 326, 1016, 4445, 273, 253, 1332, 310, 3309, 281, 20685, 8654, 3045, 30628, 8523, 26108, 253, 2934, 285, 16774, 7103, 533, 4879, 326, 253, 1543, 2797, 497, 2080, 432, 1375, 23037, 14387, 285, 2546, 323, 3081, 4679, 342, 1805, 468, 14692, 2613, 5148, 613, 920, 3082, 253, 4477, 2530, 841, 4679, 1309, 253, 2380, 2180, 285, 253, 30628, 1024, 38350, 5583, 14924, 253, 913, 18726, 342, 253, 30628, 6803 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 2175, 14168, 68, 932, 1588, 85, 7241, 275, 5148, 613, 920, 534, 2097, 10941, 247, 8892, 29765, 47037, 281, 247, 2303, 1566, 253, 2022, 7680, 273, 436, 2929, 310, 36636, 271, 5919, 1332, 281, 3989, 2303, 3210, 253, 4477, 6635, 2303, 3210, 407, 42174, 432, 253, 10254, 2990, 534, 2722, 1805, 15644, 3045, 685, 253, 42281, 49797, 3139, 275, 253, 4081, 1332, 253, 5140, 273, 2303, 3210, 285, 253, 3733, 273, 42281, 49797, 403, 5678, 285, 10040, 3146, 2684, 3368, 1543, 7568, 253, 12510, 273, 253, 4081, 1332, 20544, 337, 436, 2929, 2175, 271, 1774, 285, 4722, 1895, 275, 5148, 613, 920, 26332, 14168, 68, 932, 1588, 85, 7241, 534, 778, 18210, 253, 7364, 273, 7316, 5239, 374, 4679, 3835, 1097, 1643, 11860, 458, 1875, 285, 35221, 4715, 534, 403, 767, 2022, 4893, 273, 5148, 613, 920, 495, 436, 2929, 310, 973, 3542, 285, 3477, 281, 2096, 50276, 20881, 1255, 265, 337, 35180, 2303, 3210, 275, 1386, 818, 273, 5933, 337, 359, 476, 923, 326, 253, 4477, 6635, 247, 2303, 1566, 2556, 281, 1655, 10254, 2990, 815, 303, 506, 1109, 25094, 2299, 253, 10254, 2990, 310, 9300, 1309, 253, 1313, 255, 26208, 5199, 285, 352, 310, 5075, 275, 2393, 44540, 253, 5426, 273, 2303, 1566, 310, 247, 2266, 1566, 6485, 327, 247, 2173, 4836, 533, 253, 4561, 2303, 3210, 407, 253, 4081, 5933, 403, 417, 8654, 374, 12497, 7914, 690, 7681, 4278, 403, 417, 5469, 18212, 275, 253, 2929, 323, 1650, 275, 5150, 495, 253, 1273, 2426, 9572, 323, 253, 2957, 273, 14168, 68, 932, 1588, 85, 7241, 2299, 253, 4477, 7472, 253, 8892, 29765, 47037, 327, 253, 7316, 873, 14168, 1179, 82, 3115, 74, 2581, 685, 253, 1329, 873, 14168, 1179, 296, 1952, 74, 534, 310, 1027, 432, 337, 275, 1386, 21640, 253, 4477, 1750, 326, 253, 13650, 273, 2303, 3210, 5257, 281, 6635, 2602, 13650, 534, 310, 417, 16058, 45190, 390, 28055, 7419, 432, 436, 253, 3290, 273, 4561, 2303, 3210, 403, 417, 6760, 1580, 2303, 3210, 403, 10071, 327, 3969, 8892, 352, 310, 3309, 281, 7472, 731, 50276, 250, 3065, 50276, 18, 26535, 256, 9094, 288, 480, 36827, 298, 1162, 355, 4404, 17690, 5148, 613, 920, 432, 2303, 3210, 75, 16424, 275, 11454, 1491, 5162, 2718, 43425, 5910, 5096, 1549, 1438, 3677, 253, 4477, 452, 2908, 253, 7364, 285, 2852, 789, 275, 253, 2929, 285, 690, 643, 7364, 403, 7117, 275, 253, 20544, 285, 32213, 629, 5474, 33032, 2520, 2929, 29328, 247, 2969, 5933, 281, 897, 253, 5373, 273, 253, 3640, 940, 21755, 465, 69, 275, 5148, 613, 920, 625, 5742, 1580, 352, 310, 417, 1896, 281, 6194, 271, 2060, 9732, 323, 1016, 1313, 255, 26208, 8892, 534, 812, 320, 8307, 273, 6763, 8892, 275, 2629, 9978, 436, 1332, 31326, 247, 10254, 2990, 970, 17619, 4886, 3388, 273, 253, 42281, 49797, 285, 840, 5223, 84, 436, 10254, 2990, 715, 1016, 4836, 970, 1329, 873, 273, 326, 4836, 253, 12956, 1566, 310, 908, 347, 9732, 281, 7102, 253, 2022, 1643, 11860, 15644, 1232, 50276, 296, 3755, 20556, 50274, 2520, 789, 23970, 247, 1077, 4722, 2934, 281, 3157, 253, 26647, 273, 253, 5148, 613, 920, 11333, 432, 465, 69, 1127, 273, 1859, 352, 310, 2074, 281, 5686, 969, 11454, 6928, 835, 247, 1566, 4916, 697, 1211, 9732, 2299, 3185, 273, 3733, 432, 20041, 323, 2067, 2069, 352, 4648, 10254, 2990, 281, 6194, 253, 9732, 2797, 407, 15644, 432, 436, 10254, 2990, 285, 1925, 2303, 2990, 1014, 2167, 253, 5933, 310, 1077, 2969, 352, 2987, 1077, 973, 285, 2530, 1543, 921, 7756, 327, 8245, 323, 1027, 896, 47473, 285, 275, 1027, 4893, 1690, 9162, 35221, 4715, 285, 9077, 50275, 783, 
4028, 310, 3477, 281, 956, 285, 12342, 403, 2931, 973, 2299, 627, 403, 690, 7350, 5001, 436, 789, 50276, 20881, 1255, 265, 50275, 856, 7334, 948, 85, 5933, 476, 320, 2783, 347, 247, 3963, 6081, 534, 16897, 253, 8245, 5148, 613, 920, 5933, 432, 689, 31893, 281, 1313, 255, 1949, 5971, 1580, 253, 1543, 4081, 275, 436, 2929, 403, 1039, 3212, 1375, 23037, 14387, 256, 5503, 3045, 891, 1902, 281, 923, 387, 1878, 253, 5301, 342, 643, 3963, 6081, 3022, 3082, 323, 1643, 11860, 4715, 751, 50276, 16599, 13517, 1643, 11860, 9162, 3066, 6311, 4735, 3020, 9261, 17857, 32888, 938, 50274, 249, 1635, 347, 253, 1332, 476, 320, 2783, 275, 253, 1386, 273, 2987, 326, 4648, 465, 69, 323, 5148, 613, 920, 5301, 342, 2045, 2987, 3198, 281, 320, 2908, 50276, 32289, 2196, 17690, 5148, 613, 920, 432, 2303, 3210, 5723, 2824, 1797, 50275, 1542, 1643, 11860, 9162, 1543, 347, 359, 2550, 3839, 7472, 253, 2644, 1896, 873, 273, 1313, 255, 383, 8892, 253, 1543, 3798, 3559, 347, 253, 1599, 285, 2629, 11254, 689, 247, 14604, 273, 8892, 2299, 436, 2929, 816, 5012, 253, 6268, 327, 1264, 7587, 534, 310, 417, 7899, 2217, 50275, 1542, 391, 77, 1543, 275, 4677, 374, 2139, 253, 6268, 273, 253, 4081, 1566, 310, 1199, 2169, 685, 253, 8245, 347, 948, 85, 4648, 10254, 2990, 891, 1902, 281, 923, 1679, 6268, 275, 253, 4081, 1332, 50275, 1542, 15180, 6733, 253, 1783, 310, 1077, 3710, 285, 310, 3559, 816, 323, 247, 2173, 6661, 50275, 1747, 13218, 2829, 577, 28913, 2175, 752, 1057, 352, 1599, 281, 897, 253, 10254, 1293, 940, 21755, 513, 368, 1599, 326, 253, 5148, 613, 1216, 310, 9300, 970, 10254, 5150, 50275, 249, 3036, 495, 67, 6661, 310, 417, 2590, 604, 1072, 3596, 2139, 310, 436, 1027, 432, 253, 495, 66, 50274, 9088, 403, 671, 690, 963, 993, 275, 253, 2929, 50275, 249, 1386, 12643, 352, 943, 320, 14168, 4065, 545, 303, 1506, 2711, 74, 50275, 18941, 1895, 342, 1386, 17574, 285, 5150, 721, 50275, 249, 3036, 495, 66, 310, 943, 320, 495, 89, 7938, 50275, 9820, 5474, 339, 4029, 5933, 1754, 327, 3640, 940, 21755, 2366, 342, 247, 10254, 2990, 2444, 347, 10071, 327, 253, 39709, 4910, 281, 8415, 5148, 613, 920, 3237, 253, 4081, 5933, 2722, 22862, 342, 2709, 5368, 42281, 351, 1241, 285, 7756, 253, 2087, 2934, 310, 2969, 285, 3477, 281, 3359, 50275, 296, 3755, 20556, 50276, 18, 253, 3368, 2722, 253, 1175, 3045, 273, 253, 1566, 327, 247, 5235, 273, 8892, 50276, 19, 253, 1566, 310, 15626, 432, 1027, 24302, 2957, 13016, 285, 15180, 6733, 50275, 20881, 1255, 50276, 18, 432, 619, 4685, 5747, 512, 253, 9542, 9759, 670, 512, 253, 5933, 4295, 253, 5933, 760, 5621, 253, 42281, 49797, 281, 755, 2810, 281, 697, 10254, 1566, 534, 2987, 347, 247, 3963, 9141, 432, 436, 8668, 253, 38135, 310, 3240, 3710, 50276, 19, 253, 4081, 5933, 2722, 253, 7756, 285, 22862, 327, 2709, 278, 16878, 3210, 533, 19756, 5301, 342, 643, 256, 5503, 42281, 351, 1241, 50275, 783, 2929, 310, 3559, 275, 271, 5369, 6902, 314, 1039, 323, 253, 10668, 665, 513, 417, 452, 271, 5919, 4114, 275, 253, 5068, 3340, 275, 253, 10199, 2593, 247, 2257, 273, 12342, 2303, 1566, 3236, 1566, 285, 8892, 29765, 1220, 735, 403, 5611, 275, 253, 2847, 2308, 2403, 253, 2929, 1077, 21643, 690, 4295, 273, 253, 2644, 5933, 403, 5544, 1077, 10939, 352, 310, 417, 1077, 2590, 2139, 581, 15771, 327, 253, 643, 50276, 7152, 339, 431, 248, 2929, 23970, 247, 747, 1332, 323, 1643, 11860, 4715, 326, 4981, 3013, 247, 1881, 303, 40037, 10254, 2303, 253, 2934, 310, 281, 3037, 432, 247, 42281, 49797, 326, 310, 625, 4541, 12956, 685, 253, 2629, 42281, 49797, 50276, 783, 10254, 2303, 310, 9300, 970, 17619, 4886, 3388, 2581, 685, 256, 
35333, 253, 2746, 476, 320, 5678, 342, 247, 4618, 5235, 273, 1643, 11860, 40390, 285, 12724, 5644, 281, 2266, 11701, 275, 3045, 20544, 50276, 783, 2746, 476, 320, 5678, 342, 2710, 1643, 11860, 4715, 3082, 285, 12724, 5644, 281, 2266, 11701, 275, 3045, 50276, 783, 2934, 3139, 310, 27350, 285, 4460, 1014, 604, 253, 2934, 556, 690, 22620, 281, 7491, 10981, 1882, 5148, 613, 920, 285, 10254, 6928, 50276, 783, 2929, 3400, 7126, 22909, 285, 33954, 281, 921, 849, 253, 1332, 2987, 50276, 783, 16774, 7103, 310, 9470, 285, 10949, 1643, 11860, 9077, 1643, 11860, 9162, 1097, 275, 5028, 285, 2439, 10625, 285, 671, 35221, 4715, 50276, 3113, 2266, 1543, 2439, 512, 50276, 9088, 310, 271, 28913, 1263, 16344, 253, 2710, 4295, 4645, 326, 512, 4243, 403, 1774, 50276, 20881, 1255, 265, 50276, 9088, 403, 3240, 1142, 3081, 4295, 275, 253, 2746, 50276, 783, 2746, 23970, 2067, 3081, 4373, 22041, 323, 534, 359, 878, 281, 3609, 2193, 533, 4931, 17221, 731, 310, 417, 1512, 2834, 50276, 422, 11346, 4361, 253, 2929, 285, 253, 20544, 3012, 32180, 798, 253, 32213, 594, 891, 5583, 14924, 253, 32213, 403, 2581, 5884, 253, 2929, 588, 2779, 320, 4722, 285, 9865, 323, 253, 13361, 3114, 3340, 323, 253, 8607, 7106, 327, 5148, 613, 920, 50276, 9820, 253, 5955, 273, 7364, 285, 38058, 16274, 310, 973, 15720, 2490, 187, 4118, 18435, 27, 2520, 19529, 29328, 247, 5700, 281, 3157, 5148, 613, 920, 326, 476, 320, 3732, 281, 1142, 1027, 2613, 5148, 613, 920, 3082, 253, 2613, 5148, 613, 920, 1332, 310, 908, 281, 10939, 5223, 1097, 253, 3909, 2990, 3692, 3602, 403, 1146, 18325, 285, 247, 10254, 2990, 8818, 407, 3192, 271, 17619, 4886, 3388, 273, 253, 3909, 6928, 13461, 285, 247, 940, 21755, 2957, 310, 908, 281, 11907, 253, 12956, 3909, 2990, 342, 5926, 483, 327, 697, 3602, 281, 3761, 253, 12956, 10254, 2990, 9470, 4679, 7568, 326, 253, 4081, 1332, 19132, 3045, 273, 2067, 2613, 5148, 613, 920, 3082, 285, 326, 1016, 4445, 273, 253, 1332, 310, 3309, 281, 20685, 8654, 3045, 30628, 8523, 26108, 253, 2934, 285, 16774, 7103, 533, 4879, 326, 253, 1543, 2797, 497, 2080, 432, 1375, 23037, 14387, 285, 2546, 323, 3081, 4679, 342, 1805, 468, 14692, 2613, 5148, 613, 920, 3082, 253, 4477, 2530, 841, 4679, 1309, 253, 2380, 2180, 285, 253, 30628, 1024, 38350, 5583, 14924, 253, 913, 18726, 342, 253, 30628, 6803 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

this paper proposes lodl as a surrogate to replace the original optimization loss while approximately providing gradient information of the original optimization loss. the key argument is that the gradient information may be difficult to obtain when solving complex optimization problems, such as nonconvex optimization. in lodl, a surrogate loss is constructed based on variants of mse/quadratic functions over a dataset sampled around each training sample, such that it approximates the original optimization loss and is easily differentiable.

pros: constructing surrogates to obtain the gradients of the downstream optimization with respect to the predictions is important for decision-focused learning.

cons: the computational complexity of lodl is high. it is true that we reduce the dimensionality (line 126), but the value of n can be very large, which is typically the case in decision-focused learning where the number of training samples is large. also, we have to learn a surrogate to approximate dl for each training sample; the total complexity will be much higher than existing methods.

lodl is still an approximation-based method to differentiate the optimizer. there is no analytical/theoretical evidence to show that lodl can approximate the gradient of the original optimizer with respect to the predictions with sufficiently high accuracy. lodl is actually approximating dl, but this does not mean the gradients of dl with respect to the prediction y are still well approximated by the gradient of lodl; one can easily construct counter-examples in which two functions have similar values but dramatically different gradients in a neighborhood.

the way to sample y to train lodl and learn $\phi_n$ around each training sample $y_n$ is questionable. the samples are randomly generated based on a prior distribution in stage 1, but in stage 2, when learning the prediction model, the output predictions can follow a different distribution than the assumed prior distribution in stage 1. a direct consequence is that $\phi_n$ may not be accurate for the new distribution of predictions in stage 2, raising further concerns with the use of a pretrained $\phi_n$. this is also against the core idea of decision-focused learning, where we want to learn the predictions by considering the entire decision pipeline as a single process.

some analysis of lodl would be useful, e.g., sampling complexity and generalization bounds. the authors are suggested to highlight the targeted scenario of lodl rather than general decision-focused learning; for example, in convex problems we can efficiently and accurately differentiate the optimizer with respect to predictions, and so lodl is not needed or advantageous. yes.

this paper proposes methods to approximate the decision-focused loss function that quantifies the quality of a prediction function by the quality of its induced decision. the proposed method considers several classes of locally parameterized loss functions at each label value in the sample; these loss functions are convex functions of the prediction. the parameters in the loss function for each label value in the sample are estimated by minimizing the loss approximation error at randomly sampled prediction values around the corresponding label truth.

originality: the idea of assigning different parameters to the loss function at different label values appears novel. this is an interesting idea to improve the expressive power of the loss function class that builds on relatively simple parametric losses to guarantee convexity.

quality: the writing quality of this paper is overall very good and the proposed idea in this paper is well executed, but some important limitations of the proposed approach (e.g., computation and scalability) lack discussion.

clarity: i found this paper is overall well written and the high-level idea of this paper is easy to grasp. one exception is that the experiment section seems to lack some important details.

significance: although this paper proposes an interesting idea, i am somewhat skeptical of its practicability considering the challenges in parameter tuning and the computational scalability of the proposed method.

see the comments in the box above.

this paper proposes a novel locally optimized decision loss (lodl) for decision-focused learning (dfl). the lodl loss is a parameterized function trained with the true decision loss as the target. the lodl loss is a relatively faithful measure of the decision quality and can provide an informative gradient for the dfl training. the authors run experiments on various optimization tasks, including a linear model, web advertising, and portfolio optimization, verifying the effectiveness of lodl. also, the authors conduct some ablation studies and show how well the lodl represents the decision quality.

originality: the idea of training a parameterized function to approximate the decision loss is interesting. compared with the mse loss in the 2-stage method, the lodl is more faithful to the decision quality; compared with the surrogate loss in dfl, the lodl can be easier to design.

significance: the idea is novel for dfl.

quality: the proposed method is compared with multiple baselines, including 2-stage, dfl, and the nn-based lodl, on various resource allocation problems, which verifies the advantages of lodl for some problems.

clarity: the paper is well written and easy to follow.

the limitations of the design are clearly listed.

this paper considers learning losses for predictive models that are used in a downstream optimization task. section 2 summarizes the basic setup, where there is a predictive model $\hat{y} = m_\theta(x)$ that creates predictions used to parameterize an optimization process. the baselines considered are 1) 2-stage learning, which trains the predictive model with an intermediate loss, and 2) decision-focused learning, which seeks to optimize a decision loss with the predictive model, defined with the objective of the optimization problem. the locally optimized decision losses (lodl) proposed in this paper seek to learn the parameters of surrogate intermediate losses to match the decision loss.

strengths: the idea of parameterizing surrogate/intermediate losses makes a lot of sense in these two-stage settings, and the formulation considered here nicely shows the benefits of learning non-euclidean regression losses. i can imagine learned losses to be a crucial, impactful, and long-lasting contribution in these settings. the experimental results clearly demonstrate that the lodl is able to learn a reasonable surrogate for the tasks considered.

weaknesses: if we take the lodl to be the mse loss parameterized by weights on each dimension, i do not understand the objective on l191. my interpretation is that it tries to make the weighted mse loss match the decision loss around the optimal prediction by changing the weights; since the mse loss and decision loss are very different quantities, it seems like this objective will not be possible to optimize. even though the paper experimentally shows that lodl works, the results are difficult to contextualize and compare to related research. for example, the web advertising task starting at l222 takes one of the settings from wilder et al. 2019 (https://ojs.aaai.org/index.php/AAAI/article/view/3982) but does not provide or present the results in a way that is comparable to the results in that paper; the submission would be significantly easier to evaluate in comparison to this work if it reproduced exactly table 1 of wilder et al. 2019 and added additional lines showing how well lodl performs in comparison.

related work: it could be interesting to connect the work to learned losses used in meta-learning, such as in bechtle et al. 2021 (https://arxiv.org/pdf/1906.05374.pdf), which takes a much more black-box perspective on learning a latent loss function.

the paper does not clearly discuss limitations in a dedicated section. while parameterizing and learning an intermediate loss seems appealing, it seems limited by needing to specify and learn the right parameterization. ### Summary:
this paper considers the problem of making decision-focused learning (dfl) more usable for both researchers and practitioners. it proposes a novel approach, referred to as locally optimized decision losses (lodl), which learns the parameters of surrogate intermediate losses to match the decision loss. experimental results clearly demonstrate that the lodl approach is able to learn effective surrogates for the considered tasks. all the reviewers appreciated the lodl idea but also raised a number of concerns. there was a lot of discussion, and the authors have addressed most of the concerns and also acknowledged some limitations pointed out by some reviewers. one expert reviewer, who deeply engaged with the authors to both clarify and improve the paper, was willing to strongly champion the paper; in their words, it is a brilliant idea that will be foundational in the space and will be engaging and thought-provoking at the conference. a couple of reviewers raised a few points beyond the author-reviewer discussion which the authors could not see or respond to; however, i think the overall strengths of the paper outweigh these concerns, therefore i recommend accepting the paper. i strongly encourage the authors to improve the paper in terms of clarity, exposition, and additional experimental results to reflect the discussion with reviewers.
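the reviews and summary above describe the lodl recipe concretely enough to sketch: for each training label, sample candidate predictions in a small neighbourhood of that label, evaluate the true (hard-to-differentiate) decision loss on every sample, and fit a small convex parametric loss, for instance a weighted-mse form, to those values; the learned per-sample surrogates then stand in for the decision loss when training the predictive model. a minimal pytorch sketch of that first stage for the weighted-mse variant is below; the gaussian sampling scheme, the sample count, the optimizer settings, and the function names are assumptions for illustration, and any normalization of the targets used in the paper is omitted.

```python
import torch
import torch.nn.functional as F

def fit_weighted_mse_lodl(y_true, decision_loss, n_samples=50, noise_std=0.1,
                          lr=0.05, steps=500):
    """fit one local surrogate  lodl_n(y_hat) = sum_j w_j * (y_hat_j - y_true_j)^2
    so that it matches a black-box decision loss on predictions sampled around y_true;
    decision_loss is assumed to map a 1-d prediction vector to a scalar."""
    dim = y_true.shape[0]
    y_samples = y_true + noise_std * torch.randn(n_samples, dim)  # stage-1 dataset around the label
    with torch.no_grad():
        targets = torch.tensor([float(decision_loss(y)) for y in y_samples])
    log_w = torch.zeros(dim, requires_grad=True)  # positive weights keep the surrogate convex
    opt = torch.optim.Adam([log_w], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        surrogate = ((y_samples - y_true) ** 2 * log_w.exp()).sum(dim=1)
        F.mse_loss(surrogate, targets).backward()
        opt.step()
    w = log_w.detach().exp()
    # smooth and convex in y_hat, so stage 2 can back-propagate through it when training the predictor
    return lambda y_hat: ((y_hat - y_true) ** 2 * w).sum()
```

in a second stage the predictive model would then be trained by minimizing the sum of these per-sample surrogates evaluated at its predictions, exactly as a 2-stage pipeline minimizes mse, but with losses that were fit to track decision quality. this also makes concrete the reviewer concern above: the surrogates are only fit on samples drawn near the true labels, not on the distribution of predictions the model actually produces in stage 2.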
[input_ids, attention_mask and labels columns for this row: input_ids is a long list of token ids encoding the row's review/summary text, attention_mask is a same-length list of 1s, and labels appears to be a verbatim copy of input_ids; the raw arrays are omitted here for readability.]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this work attempts to improve drl based mixedprecision network compression compared to the standard actioncritic method he proposed method inserts an additional candidate evaluation step on k proposed actions as shown in algorithm 1 as the paper title suggests the proposed augmentation is simple yet practically brings nontrivial performance improvement i am not a real expert on rl in general i appreciate the idea in this work it generates higherquality action proposals by heuristically evaluating a few of them either using a quick calculation of inference accuracy or direct quantization loss the proposed idea is intuitive and well validated by experiments i have several suggestions for improving this work the distancebased indicator is proved to be inferior in terms of performance mentioned by authors it is still suggested to report its accuracies in tables 1 and 2 for a better view of two variants a few issues in experiments need clarification is the reported time the total before convergence or the perepisode time the proposed method takes much fewer episodes to search but an additional qvalue indicator brings more computations more analysis regarding the complexity is required there are a few ad hoc treatments in the method such as memorizing the accuracy of specific bit value at a layer does it consider the evolution of the entire network since the change of other layers have impact the theoretic analysis is based on assumptions that can be only empirically verified yet not convincingly verified in the experimental sections it is not clear to me the true value of section 32 docsep summary this paper studies the dnn quantization using deep reinforcement learning the paper proposes an augmented drl which introduces a qvalue indicator to refine action selection the proposed approach has been applied to several image classification baselines and has compared with several recent drl based quantization approach achieving a similar compression rate without accuracy decrease in addition compared to previous methods the learning speed has been improved by 4564x pros 1 the paper is well written and easy to follow it is very clear even for an audience who may not be an expert on dnn quantization 2 the idea is simple and reasonable to introduce an additional qvalue indicator to refine action selection given the improvement in performance despite the simplicity the method does provide great performance concernscomments 1 is there any result about the final quantization configuration of the compressed model is there any takeaway about the quantization pattern 2 what is the action space used in the experiments how many bits were the model used 3 in table 1 could you provide the original accuracy of the models rather than the accuracy delta in addition is that possible to quantize deeper models like resnet101 or resnet 152 is the method extendable to the latest model architectures such as resnext densenet visual transformers docsepthis paper describes an improved way to determine weight quantization bit lengths using reinforcement learning by injecting model evaluation directly into action selection building upon a drl setup where the action at each timestep corresponds to selecting a bit value for each layer the method adds a qvalue indicator function q that selects among a set of candidate actions and filters based on the model performance quantizing the layer to each level this seems to 
form a hybrid between drl and greedy search using a greedy criterion q to filter proposals made by the drl agent experiments show very good performance with similar or better quantization levels and accuracy as other drlbased methods and much faster runtime the method is well described overall though i would have liked more details on the mu and q networks including their initializations and how well the mu proposals span the action space initially see questions below likewise i wonder if the mechanism of improvement is mostly through the initial guess provided by greedy search with q at the beginning of training an alternative may be to edit the action network mu using a guess made by greedy profiling see question 3 below was anything like this explored although i wonder whether the method may be able to be further simplified i still find this a good paper overall offering an effective way to reduce time to produce quantized models with no accuracy hit a few additional baselines of even simpler systems including greedy search and greedy policy initialization could help provide more context and assess the use of q as a filter additional questions 1 the discretized action space only has 8 values per layer what happens if one evaluates all 8 with the q indicator and removes the candidate generation network is this equivalent to a layerwise greedy search what is the final accuracy and quantization speed of this baseline 2 how is the last layer of the expanded actor function mu initialized randomly or in such a way that the initial outputs correspond to different discrete bit sizes how many candidate actions are there how well do the candidate values span the action space to allow for guidance by q particularly in the beginning episodes 3 instead of having multiple candidates is it possible to introduce an output bias per layer in the actor function mu initialized to the bit sizes determined by greedy profiler search with q that is to use a function mu'_l(s) = mu(s) + b_l where b_l = argmax_a q(a) for each layer l then use regular drl not adrl with this mu' that has an initial guess using greedy search from the profiler 4 i dont see where the system is encouraged to use fewer bits if more bits generally lead to higher accuracy what makes the system learn to output smaller bits in its actions both q augmentation and the reward r = acc_quant - acc_orig depend only on accuracy 5 how different are the proposal actions from one another do they span the space of possible bit lengths and how do they change over the course of model selection is the initial max according to q used most of the time or are there times when the proposal index selected by q changes from one episode to the next ### Summary:
this paper proposes a simple yet effective approach for determining weight quantization bit lengths using rl all the reviewers agree that the simplicity and performance improvements are a strong plus point there are some concerns on applicability which have been sufficiently handled by rebuttal ac recommends accepting the paper
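To make the candidate-evaluation step described in the reviews above concrete, here is a minimal sketch in Python/PyTorch. It is not the paper's implementation: quantize_layer_ and actor.propose are assumed placeholder interfaces, and the accuracy-on-a-calibration-batch proxy is only one of the indicator variants the reviews mention.

import copy
import torch

def quick_quality(model, layer_name, bits, calib_batch):
    # cheap stand-in for the q-value indicator: quantize a single layer and
    # score the result on a small calibration batch
    q_model = copy.deepcopy(model)
    quantize_layer_(q_model, layer_name, bits)      # assumed helper, not a real api
    x, y = calib_batch
    with torch.no_grad():
        acc = (q_model(x).argmax(dim=1) == y).float().mean().item()
    return acc

def select_bit_width(actor, state, model, layer_name, calib_batch, k=4):
    # the actor proposes k candidate bit widths for the current layer; the
    # indicator evaluates each and the best-scoring candidate becomes the action
    candidates = actor.propose(state, num_candidates=k)  # assumed interface
    scores = [quick_quality(model, layer_name, b, calib_batch) for b in candidates]
    best = max(range(len(candidates)), key=lambda i: scores[i])
    return candidates[best]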
[input_ids, attention_mask and labels columns for this review/summary pair: a long list of token ids, a same-length list of 1s, and a verbatim copy of the token ids; the raw arrays are omitted here for readability.]
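One of the reviews of the quantization paper above asks whether evaluating all eight discrete bit values with the q indicator, without any candidate-generation network, reduces to a layer-wise greedy search. A minimal sketch of that baseline, reusing the assumed quick_quality and quantize_layer_ helpers from the previous snippet; the concrete bit choices are an assumption:

BIT_CHOICES = [1, 2, 3, 4, 5, 6, 7, 8]   # assumed 8-value discrete action space

def greedy_bit_assignment(model, layer_names, calib_batch):
    # score every allowed bit width for each layer with the same cheap
    # indicator and commit the best one before moving to the next layer
    assignment = {}
    for layer_name in layer_names:
        scores = {b: quick_quality(model, layer_name, b, calib_batch) for b in BIT_CHOICES}
        assignment[layer_name] = max(scores, key=scores.get)
        quantize_layer_(model, layer_name, assignment[layer_name])  # assumed helper
    return assignment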
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors claimed in this paper that as the most empirically successful approach to defending adversarial examples pgdbased adversarial training is computationally inefficient fast adversarial training could mitigate this issue by training a model using fgsm attacks initialized with large randomized perturbations but the underlying reason for its success remains unclear and it may still suffer from catastrophic overfitting the authors conducted a series of experiments to figure out the key to the success and properties of fast adversarial training the experimental results showed that fast adversarial training cannot avoid catastrophic overfitting but could be able to recover from catastrophic overfitting quickly based on all of the observations the authors proposed a simple method to improve fast adversarial training by using pgd attack as training instead of rfgsm attack proposed in fast adversarial training when overfitting happens or using fast adversarial training as a warmup the proposed methods could achieve slightly better performance than the current stateofart approach while reducing the training time significantly overall i vote for weak reject marginally i like the idea of exploring the properties of adversarial training the experiments may also be inspiring but my major concern is that the interpretation about the catastrophic overfitting is not clear and the interpretation about the effectiveness of rfgsm and pgd against overfitting is also not clear hopefully the authors can address my concern in the rebuttal period pros pros 1attempting to interpret the successful reason for a previous work is interesting and the exploratory experiments may be inspiring for other researchers 2 overall the paper was well written all the motivations and conjectures are easy to follow and understand 3 this paper provides a lot of experiments to show the effectiveness of the proposed methods which appeared slightly better than the sota pgdtraining while reducing training time significantly cons 1 although the authors attempted to explain the key to the success of fast adversarial training it might be still not clear theoretically 1 why rfgsm and pgd could guide the model to recovery from catastrophic overfitting but fgsm could not does it mean that stronger attacks could guide the model to recovery 2 why the catastrophic overfitting happened a lot of times when using fgsm training but rfgsm and pgd could mitigate it does it mean that stronger attacks could mitigate it 2 as i understand figure 3c should be the result of proposed fastadv from figure 3c it can be observed that there are catastrophic overfitting in fastadv but this phenomenon could not be seen from figure 4 do you have any idea to explain it 3 as concerned in my 11 and 2 if weaker attacks lead to more catastrophic overfitting and could not guide the model to recovery why the fastadvw 48 using a weaker attack as a warmup could outperform fastadv and fastadvw 88 4 though the proposed methods appear useful they may be a bit straightforward and have a limited novelty using pgd attacked samples when catastrophic overfitting happens docsepsummary this paper proposes a method called improved fast adversarial training fastadv which improves fast adversarial training by replacing randomized initialization with pgd adversarial training when there is overfitting issue during training strengths the idea is reasonable and easy to 
implement there is a marginal improvement of accuracy under pgd attack epsilon8255 of around 2 comparing fast adversarial training fastadv with improved fast adversarial training fastadv on cifar10 weakness 1 fastadv itself does not perform better than pgd adversarial training when use the fastadv result as starting point then do pgd adversarial training it outperforms pgd adversarial training therefore fastadvw the combined method is actually more computational expensive than pgd adversarial training 2 the advantage of fast adversarial training is that it is less computational expensive than pgd adversarial training so it can scale up to large dataset like imagenet however fastadvw does not have this advantage anymore 3 the improvement of fastadv over fastadv on cifar100 and tinyimagenet is marginal 4 the methods are evaluated under one attack strength epsilon8255 for cifar10 it is better to evaluate the methods under different attack strengths clarity the paper is clearly written and easy to follow reproducibility details of the algorithm is provided but code is not conclusion the method is novel but the contribution is not strongdocsep summary this paper shows that the main reason for the success of fast adversarial training 1 will be referred to as fbf in this review is its ability to recover from catastrophic overfitting based on this observation the authors propose to utilize pgd multistep training for a few iterations when catastrophic overfitting occurs and resume singlestep training after the model recovers further the authors also propose to use this improved fast adversarial training fastadv as a warmup for pgd adversarial training and demonstrate improved performance over pgdat at a significantly lower computational cost 1 wong et al fast is better than free revisiting adversarial training iclr 2020 pros the authors present a very interesting finding that fbf has catastrophic overfitting for a few intermediate iterations and recovers very quickly from this based on this observation they propose to use pgd based training for a few intermediate iterations which seems to prevent this overfitting effectively the authors also propose the use of this training as warm up for pgd training and demonstrate significantly improved results this is certainly a significant contribution of the paper since it achieves improved results at a much lower computational cost cons since the proposed defenses involve singlestep training the absence of gradient masking needs to be justified using thorough validation as discussed by carlini et al 1 using gradientfree attacks such as square attack spsa blackbox transfer based attacks attacks with multiple steps and multiple random restarts also the sanity checks proposed by athalye et al 2 need to be demonstrated the paper discusses results only on pgd attack which is not the current stateoftheart the proposed defenses fastadv fastadvw must be evaluated on stronger attacks such as autoattack 3 and multitargeted attack 4 while the use of the modified fgsm attack would improve the efficiency of pgd training it is not clear why it should lead to improved robustness could the authors clarify if all the other training hyperparameter settings such as batch size weight decay optimizer initial learning rate and schedule number of epochs initial random noise added for the attack validation split use of early stopping batch norm in train eval mode during attack generation 5 are similar for pgdat reported in table4 of the appendix and fastadvw reported in table1 the 
learning rate schedule used for pgdat sec 32 is different from that used by rice et al so the results could be suboptimal could the authors use similar settings as rice et al sgd optimizer using a batch size of 128 a stepwise learning rate decay set initially at 0.1 and divided by 10 at epochs 100 and 150 total epochs 200 and weight decay 5e-4 for reporting the pgd baseline results 1 carlini et al on evaluating adversarial robustness https://arxiv.org/abs/1902.06705 2 athalye et al obfuscated gradients give a false sense of security circumventing defenses to adversarial examples icml 2018 3 croce et al reliable evaluation of adversarial robustness with an ensemble of diverse parameterfree attacks icml 2020 4 gowal et al an alternative surrogate loss for pgdbased adversarial testing https://arxiv.org/pdf/1910.09338.pdf 5 bag of tricks for adversarial training https://openreview.net/forum?id=xb8xvrtb8ce reasons for score the paper highlights a very interesting finding in fbf training proposes a singlestep defense and also an improvement to speedup pgdat however it does not show sufficient experimental results for reliable evaluation of the defenses and to ensure the absence of gradient masking hence i think the paper is marginally below the acceptance threshold i will be happy to increase the score if the required experimental results are presented during the rebuttal questions during rebuttal period could the authors provide the following results for cifar10 for both the proposed defenses fastadv fastadvw evaluation against autoattack 3 and multitargeted attack 4 it would help to also report the corresponding results for the baselines fbf pgdat plot of robust accuracy vs attack distortion bound epsilon for pgd10 step attack as discussed by athalye et al 2 accuracy on black box transfer based attacks using normally trained model of the same architecture as a source could the authors also report pgd10 step accuracy for pgdat and fastadvw for cifar10 dataset on preactresnet18 and wrn-34-10 this would help with comparison of baselines against those reported in prior work in the proposed defense fastadv could the authors clarify what is the fraction of validation split used for detecting catastrophic overfitting and what is the number of steps used for the pgd attack on the validation set is the validation time included in the time reported in table1 if not could the authors report the total time including validation could the authors clarify the size of the validation split used for early stopping also in fig4 and the number of steps used for the pgd attack for early stopping is this consistent across all experiments good to have loss surface plots for the proposed defenses to show the absence of gradient masking similar to those reported in 5 additional feedback not part of decision assessment the authors mention the following although the model quickly transforms into a nonrobust one it is fundamentally different from an ordinary nonrobust model it would be insightful to study the properties of this intermediate model that suffers from catastrophic overfitting to understand more about why it is able to recover so quickly the authors could visualize the loss surface of the fbf trained model during the catastrophic overfitting and immediately after it recovers this can lead to insightful findings on what is happening in the vicinity of the data samples in very few iterations the loss surface can be plotted similar to fig1 in 5 5 tramer et al ensemble adversarial training attacks and defenses iclr 2018 update after
rebuttal the authors response addresses my concerns and i would like to update my score to 7 the paper presents insightful findings about fbf and proposes a simple and effective method for stabilizing singlestep adversarial training this is useful not only for fbf but also for other singlestep defenses although the gain in robustness is marginal stabilizing singlestep training is useful the authors also propose a computationally efficient method of achieving robustness similar to pgd training docsepthis paper proposes an improvement over fast adversarial training to improve the robustness of the model in an efficient manner overall its a well written paper but it can be improved in certain ways as follows fig 3 lacks important information about what specific attack was used to compute the robust accuracy was it pgd if yes what are the pgd parameters its not clear if the improvements are due to few iterations of pgd or due to piecewise linear learning rate regime explicit experiments comparing the two learning rate regimes cyclic and piecewise would be good to confirm for section 5 when authors are using fgsm as a warmup they should refer to and compare with https://arxiv.org/pdf/2002.04237.pdf where clean training is used as warmup authors mention that strength of the adversary doesnt matter in the initial phase of training so should compare results with varying strength going to zero in most of the experiments the pgd step size is missing moreover it will be good to see the robustness of this technique against varying pgd step size i think authors use rfgsm and fast adversarial training interchangeably which creates some confusion it will be nice to be consistent with the terminology ### Summary:
this paper first investigates the behavior eg catastrophic overfitting of fast adversarial training fastadv through experiments it finds that the key to its success is the ability to recover from overfitting to weak attacks then it presents a simple fix fastadv that incorporates pgd adversarial training when catastrophic overfitting is observed the resulting method is shown to be able to train for a large number of epochs it also presents a version fastadvw that uses the improved fast adversarial training as a warmup of pgd adversarial training as in previous work overall the analysis is useful and the ideas are valid the empirical results also show promise however the main weakness of such empirical analysis is that it may be sensitive to the settings eg number of epochs and splitting of datasets the authors rebuttal also reflected such potential concerns
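For readers unfamiliar with the mechanism these reviews discuss, here is a minimal sketch of single-step adversarial training with a catastrophic-overfitting check and a temporary PGD fallback. It is an illustration of the idea, not the authors' code: fgsm_attack, pgd_attack, the probe frequency, and the 0.10 collapse threshold are all assumed placeholders.

import torch
import torch.nn.functional as F

def train_epoch(model, loader, opt, val_batch, eps=8/255, check_every=50):
    recovery_steps = 0
    for i, (x, y) in enumerate(loader):
        if recovery_steps > 0:
            x_adv = pgd_attack(model, x, y, eps, steps=10)           # temporary multi-step phase
            recovery_steps -= 1
        else:
            x_adv = fgsm_attack(model, x, y, eps, random_init=True)  # cheap single-step phase
        loss = F.cross_entropy(model(x_adv), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
        if i % check_every == 0:                                     # cheap overfitting probe
            xv, yv = val_batch
            xv_adv = pgd_attack(model, xv, yv, eps, steps=10)
            with torch.no_grad():
                robust = (model(xv_adv).argmax(dim=1) == yv).float().mean().item()
            if robust < 0.10:            # assumed collapse threshold
                recovery_steps = 5       # recover with a few pgd-trained updates
    return model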
[input_ids column for the row above: a long list of token ids encoding the review text; the raw array is omitted here for readability.]
10166, 1566, 273, 253, 1072, 10336, 347, 247, 2603, 50273, 16534, 253, 4477, 671, 1304, 23256, 69, 740, 3213, 7200, 323, 23256, 8608, 285, 3809, 24301, 88, 323, 260, 338, 274, 740, 10895, 327, 638, 514, 373, 3024, 1093, 285, 1488, 79, 1706, 740, 436, 651, 1361, 342, 5301, 273, 1666, 25379, 1411, 1110, 2361, 275, 2720, 789, 50273, 249, 253, 4081, 5684, 3809, 24301, 812, 253, 4477, 19148, 752, 310, 253, 6919, 273, 12820, 8085, 908, 323, 15549, 36256, 689, 31893, 285, 752, 310, 253, 1180, 273, 5018, 908, 323, 253, 23256, 69, 2983, 327, 253, 12820, 873, 310, 253, 12820, 673, 2908, 275, 253, 673, 2361, 275, 2829, 18, 604, 417, 812, 253, 4477, 1304, 253, 2264, 673, 1690, 12820, 50273, 16534, 253, 4477, 19148, 253, 1979, 273, 253, 12820, 8085, 908, 323, 2393, 15910, 671, 275, 3036, 21, 285, 253, 1180, 273, 5018, 908, 323, 253, 23256, 69, 2983, 323, 2393, 15910, 310, 436, 5185, 2439, 512, 4679, 50272, 12311, 281, 452, 2957, 2553, 14777, 323, 253, 4081, 25774, 281, 921, 253, 5928, 273, 11786, 44790, 2074, 281, 1110, 2361, 275, 608, 50275, 38092, 8680, 417, 629, 273, 3061, 6803, 50271, 783, 4477, 3748, 253, 1563, 3738, 253, 1566, 4541, 29698, 715, 247, 1327, 18848, 461, 581, 352, 310, 26401, 1027, 432, 271, 9826, 1327, 18848, 461, 1566, 352, 651, 320, 47860, 281, 1263, 253, 3607, 273, 436, 10444, 1566, 326, 27171, 432, 36256, 689, 31893, 281, 2096, 625, 670, 2139, 352, 310, 2104, 281, 9295, 594, 4541, 50272, 783, 4477, 812, 31986, 253, 2957, 2553, 273, 253, 269, 3342, 10166, 1566, 1309, 253, 36256, 689, 31893, 285, 4745, 846, 352, 761, 12239, 436, 476, 1421, 281, 47860, 4342, 327, 752, 310, 9369, 275, 253, 21520, 273, 253, 941, 3530, 275, 1077, 1643, 25142, 253, 2957, 2553, 476, 320, 17944, 2074, 281, 3036, 18, 275, 608, 50275, 22, 492, 13429, 1162, 355, 19862, 48960, 3733, 8104, 285, 25774, 17857, 32888, 4765, 50275, 11183, 846, 30080, 22559, 50275, 783, 4477, 2380, 12453, 619, 7350, 285, 891, 651, 751, 281, 5731, 619, 4868, 281, 818, 253, 2929, 10262, 47860, 4342, 670, 269, 3342, 285, 29328, 247, 2969, 285, 3576, 1332, 323, 41427, 1625, 46701, 554, 48960, 3733, 436, 310, 4217, 417, 760, 323, 269, 3342, 533, 671, 323, 643, 1625, 46701, 554, 25774, 3738, 253, 6351, 275, 31640, 310, 16888, 41427, 1625, 46701, 554, 3733, 310, 4217, 253, 4477, 671, 12661, 247, 43245, 5919, 1332, 273, 17170, 31640, 2074, 281, 23256, 69, 3733, 50276, 7152, 33032, 2520, 2929, 29328, 271, 7756, 689, 3809, 48960, 3733, 281, 3157, 253, 31640, 273, 253, 1566, 275, 271, 5919, 5133, 4583, 697, 247, 973, 3542, 2929, 533, 352, 476, 320, 5520, 275, 2176, 4088, 347, 3637, 50275, 926, 495, 19756, 1774, 1491, 670, 752, 2173, 2983, 369, 908, 281, 11897, 253, 10237, 7200, 369, 352, 23256, 69, 604, 4754, 752, 403, 253, 23256, 69, 3602, 50276, 953, 417, 2590, 604, 253, 11701, 403, 1955, 281, 1643, 25142, 273, 23256, 69, 390, 1955, 281, 5313, 3020, 4872, 4715, 2281, 9459, 271, 6843, 4679, 10941, 253, 767, 4715, 2281, 27005, 19870, 285, 5313, 3020, 651, 320, 1175, 281, 6583, 50275, 1542, 2593, 608, 672, 4477, 403, 970, 269, 72, 3610, 347, 247, 5890, 484, 597, 943, 3730, 23813, 342, 5987, 39962, 2061, 9275, 1518, 15781, 20991, 9275, 835, 4076, 3733, 310, 908, 347, 5890, 484, 2488, 3748, 326, 4757, 273, 253, 34014, 36908, 2647, 275, 253, 3302, 3408, 273, 3733, 594, 943, 7277, 1543, 342, 11962, 4757, 1469, 281, 5058, 50275, 249, 954, 273, 253, 4679, 253, 23256, 69, 3213, 1979, 310, 5816, 25761, 352, 588, 320, 1175, 281, 923, 253, 31640, 273, 436, 5853, 1411, 11962, 23256, 69, 3213, 1979, 50275, 74, 1158, 4477, 897, 391, 16054, 3610, 285, 
3809, 48960, 3733, 28961, 1598, 534, 10513, 690, 13775, 352, 588, 320, 5322, 281, 320, 5185, 342, 253, 28939, 50275, 187, 187, 4118, 18435, 27, 2520, 2929, 806, 2340, 684, 253, 3879, 24088, 36256, 689, 31893, 273, 3809, 48960, 3733, 3809, 24301, 949, 4679, 352, 9010, 326, 253, 2234, 281, 697, 2323, 310, 253, 3745, 281, 9295, 432, 689, 31893, 281, 5075, 8104, 840, 352, 10262, 247, 2969, 4993, 3809, 24301, 326, 31167, 23256, 69, 48960, 3733, 672, 36256, 689, 31893, 310, 2540, 253, 4795, 1332, 310, 2011, 281, 320, 2104, 281, 6194, 323, 247, 1781, 1180, 273, 44540, 352, 671, 10262, 247, 2715, 3809, 24301, 88, 326, 897, 253, 5520, 3809, 48960, 3733, 347, 247, 5890, 484, 273, 268, 27421, 324, 735, 24406, 3733, 2074, 347, 275, 2045, 789, 4583, 253, 1783, 310, 4217, 285, 253, 5697, 403, 3588, 253, 16774, 1543, 671, 921, 9023, 2299, 253, 2022, 14855, 273, 824, 16774, 1783, 310, 326, 352, 778, 320, 7996, 281, 253, 7533, 24088, 50276, 1171, 44540, 19860, 273, 15302, 50276, 783, 4477, 30080, 22559, 671, 11392, 824, 2442, 7350 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 285, 23256, 69, 812, 29966, 352, 1057, 352, 1599, 326, 10046, 8104, 812, 29966, 352, 50276, 19, 347, 891, 2096, 4677, 495, 68, 943, 320, 253, 906, 273, 4081, 3809, 24301, 432, 4677, 495, 68, 352, 476, 320, 2540, 326, 627, 403, 36256, 689, 31893, 275, 3809, 24301, 533, 436, 11562, 812, 417, 320, 2326, 432, 4677, 577, 513, 368, 452, 667, 2934, 281, 5513, 352, 50276, 20, 347, 7514, 275, 619, 1903, 285, 374, 604, 21076, 8104, 1421, 281, 625, 36256, 689, 31893, 285, 812, 417, 7102, 253, 1566, 281, 7355, 2139, 253, 3809, 24301, 88, 5693, 970, 247, 21076, 2983, 347, 247, 5890, 484, 812, 562, 32231, 3809, 24301, 285, 3809, 24301, 88, 11003, 50276, 21, 2167, 253, 4081, 3082, 3176, 4217, 597, 778, 320, 247, 2372, 15246, 285, 452, 247, 3710, 38135, 970, 23256, 69, 13964, 3530, 672, 36256, 689, 31893, 6569, 5474, 339, 793, 360, 3454, 436, 2929, 29328, 247, 1332, 1925, 5520, 3809, 48960, 3733, 3809, 24301, 534, 19132, 3809, 48960, 3733, 407, 15706, 14871, 31850, 342, 23256, 69, 48960, 3733, 672, 627, 310, 689, 31893, 2523, 1309, 3733, 50276, 296, 3755, 20556, 253, 2934, 310, 5272, 285, 3477, 281, 3359, 627, 310, 247, 16888, 7756, 273, 7200, 762, 23256, 69, 2983, 299, 4277, 25, 10637, 273, 1475, 374, 10941, 3809, 48960, 3733, 3809, 24301, 342, 5520, 3809, 48960, 3733, 3809, 24301, 327, 260, 338, 274, 740, 50275, 20881, 1255, 337, 3809, 24301, 3139, 1057, 417, 1347, 1805, 685, 23256, 69, 48960, 3733, 672, 897, 253, 3809, 24301, 906, 347, 4983, 1127, 840, 513, 23256, 69, 48960, 3733, 352, 41731, 13015, 23256, 69, 48960, 3733, 3103, 3809, 24301, 88, 253, 5678, 1332, 310, 2686, 625, 15180, 8214, 685, 23256, 69, 48960, 3733, 50276, 19, 253, 5750, 273, 3809, 48960, 3733, 310, 326, 352, 310, 1679, 15180, 8214, 685, 23256, 69, 48960, 3733, 594, 352, 476, 4311, 598, 281, 1781, 10895, 751, 4440, 257, 292, 2299, 3809, 24301, 88, 1057, 417, 452, 436, 5750, 10542, 495, 253, 7756, 273, 3809, 24301, 689, 3809, 24301, 327, 260, 338, 274, 2313, 285, 10058, 303, 6533, 292, 310, 16888, 577, 253, 3082, 403, 6760, 762, 581, 2983, 4757, 299, 4277, 25, 10637, 323, 260, 338, 274, 740, 352, 310, 1805, 281, 7472, 253, 3082, 762, 1027, 2983, 20544, 50276, 498, 15752, 253, 2929, 310, 4518, 3542, 285, 3477, 281, 956, 50276, 250, 5551, 33593, 4278, 273, 253, 5933, 310, 2530, 533, 2127, 310, 417, 50276, 585, 3444, 253, 1332, 310, 4460, 533, 253, 7680, 310, 417, 2266, 7152, 33032, 6010, 50275, 2520, 2929, 2722, 326, 253, 2022, 1921, 323, 253, 2323, 273, 3809, 48960, 3733, 337, 588, 320, 6289, 281, 347, 269, 3342, 275, 436, 2278, 310, 697, 3745, 281, 9295, 432, 36256, 689, 31893, 1754, 327, 436, 8310, 253, 4477, 12661, 281, 16584, 23256, 69, 1554, 382, 554, 3733, 323, 247, 1643, 25142, 672, 36256, 689, 31893, 6634, 285, 21058, 1625, 46701, 554, 3733, 846, 253, 1566, 761, 12239, 2007, 253, 4477, 671, 12661, 281, 897, 436, 5520, 3809, 48960, 3733, 3809, 24301, 347, 247, 5890, 484, 323, 23256, 69, 48960, 3733, 285, 7568, 5520, 3045, 689, 23256, 8608, 387, 247, 3012, 2406, 15180, 2105, 50275, 18, 259, 543, 1162, 355, 3809, 310, 1805, 685, 1959, 27694, 2996, 48960, 3733, 17857, 32888, 9169, 50275, 856, 84, 50269, 783, 4477, 1246, 247, 1077, 4722, 4560, 326, 269, 3342, 556, 36256, 689, 31893, 323, 247, 1643, 10444, 25142, 285, 761, 12239, 1077, 4541, 432, 436, 50272, 3169, 327, 436, 8310, 597, 12661, 281, 897, 23256, 69, 1754, 3733, 323, 247, 1643, 10444, 25142, 534, 3133, 281, 3657, 436, 689, 31893, 8069, 50272, 783, 4477, 671, 12661, 253, 897, 273, 436, 3733, 347, 5890, 598, 323, 23256, 69, 3733, 285, 7568, 3012, 5520, 1543, 50272, 2520, 
310, 5604, 247, 1534, 7680, 273, 253, 2929, 1580, 352, 33526, 5520, 1543, 387, 247, 1199, 2406, 15180, 2105, 50274, 5040, 50271, 17480, 253, 4081, 25774, 6388, 1625, 46701, 554, 3733, 253, 5928, 273, 11786, 44790, 3198, 281, 320, 17285, 970, 11080, 12820, 347, 5469, 407, 1113, 3642, 74, 1162, 355, 337, 970, 11786, 4924, 8104, 824, 347, 6278, 2983, 653, 6678, 2806, 3364, 3700, 1754, 8104, 8104, 342, 2709, 5018, 285, 2709, 3632, 1551, 12863, 671, 253, 45985, 12255, 4081, 407, 9621, 5242, 70, 1162, 355, 374, 878, 281, 320, 5183, 50272, 783, 2929, 25339, 1543, 760, 327, 23256, 69, 2983, 534, 310, 417, 253, 1655, 1375, 23037, 14387, 253, 4081, 25774, 3809, 24301, 3809, 24301, 88, 1364, 320, 6760, 327, 10046, 8104, 824, 347, 6753, 35946, 495, 285, 1554, 262, 1816, 264, 2983, 577, 50272, 6050, 253, 897, 273, 253, 7321, 269, 72, 3610, 2983, 651, 3157, 253, 6733, 273, 23256, 69, 3733, 352, 310, 417, 2590, 2139, 352, 943, 1421, 281, 5520, 31640, 50271, 16534, 253, 4477, 19148, 604, 512, 253, 643, 3733, 4373, 19484, 7533, 824, 347, 14604, 1979, 2801, 10027, 5556, 6081, 3302, 4715, 2281, 285, 10130, 1180, 273, 44540, 3302, 3632, 6046, 2879, 323, 253, 2983, 12820, 8085, 897, 273, 2393, 15910, 14604, 5222, 275, 6194, 2777, 4438, 1309, 2983, 5978, 608, 403, 2074, 323, 23256, 8608, 2361, 275, 2829, 21, 273, 253, 30762, 285, 3809, 24301, 88, 2361, 275, 2829, 18, 50272, 783, 4715, 2281, 10130, 908, 323, 23256, 8608, 4706, 4567, 310, 1027, 432, 326, 908, 407, 11789, 1162, 355, 594, 253, 1543, 812, 320, 749, 29776, 812, 253, 4477, 897, 2074, 7533, 347, 11789, 1162, 355, 256, 35333, 5556, 6081, 970, 247, 14604, 1979, 273, 12842, 247, 3213, 3020, 4715, 2281, 10027, 873, 8523, 387, 14805, 285, 4272, 407, 884, 387, 44540, 2233, 285, 7783, 2264, 44540, 1052, 285, 2801, 10027, 608, 70, 21, 323, 9610, 253, 23256, 69, 8245, 1543, 50274, 18, 1113, 3642, 74, 1162, 355, 327, 16344, 48960, 31640, 5987, 39962, 2061, 5375, 746, 9992, 2251, 1762, 50276, 19, 9621, 5242, 70, 1162, 355, 691, 71, 19387, 456, 27935, 1918, 247, 3221, 3282, 273, 3988, 39256, 272, 25774, 281, 48960, 6667, 17857, 1686, 4765, 50276, 20, 9187, 336, 1162, 355, 9630, 7103, 273, 48960, 31640, 342, 271, 19862, 273, 11117, 4764, 4924, 8104, 17857, 1686, 9169, 50276, 21, 305, 319, 267, 1162, 355, 271, 5795, 35701, 2957, 323, 23256, 69, 3169, 48960, 5175, 5987, 39962, 2061, 9275, 746, 2313, 26, 23922, 9275, 50276, 22, 7351, 273, 24866, 323, 48960, 3733, 5987, 5758, 15337, 3024, 39061, 16159, 67, 25, 89, 87, 1378, 67, 25, 336, 50275, 250, 3743, 323, 4868, 50274, 783, 2929, 16681, 247, 1077, 4722, 4560, 275, 269, 3342, 3733, 29328, 247, 1625, 46701, 554, 5684, 285, 671, 271, 7756, 281, 3885, 484, 23256, 8608, 2299, 352, 1057, 417, 921, 4209, 5661, 1543, 323, 9630, 7103, 273, 253, 25774, 285, 281, 5416, 253, 5928, 273, 11786, 44790, 7613, 891, 1158, 253, 2929, 310, 42876, 2708, 253, 14924, 7887, 891, 588, 320, 5211, 281, 2572, 253, 4868, 604, 253, 2424, 5661, 1543, 403, 3559, 1309, 253, 30080, 22559, 50274, 34974, 1309, 30080, 22559, 2180, 50271, 16534, 253, 4477, 2085, 253, 1563, 1543, 323, 260, 338, 274, 740, 323, 1097, 253, 4081, 25774, 3809, 24301, 3809, 24301, 88, 50266, 15419, 2368, 1411, 6753, 35946, 495, 285, 1554, 262, 1816, 264, 2983, 577, 352, 651, 1361, 281, 671, 1304, 253, 3969, 1543, 323, 253, 1666, 25379, 50276, 71, 3342, 23256, 8608, 50267, 14095, 273, 10237, 7200, 4632, 2983, 22841, 3033, 299, 4277, 323, 23256, 69, 740, 3213, 2983, 347, 5469, 407, 9621, 5242, 70, 1162, 355, 374, 50266, 18921, 1974, 327, 2806, 3817, 3700, 1754, 8104, 970, 9403, 
10166, 1566, 273, 253, 1072, 10336, 347, 247, 2603, 50273, 16534, 253, 4477, 671, 1304, 23256, 69, 740, 3213, 7200, 323, 23256, 8608, 285, 3809, 24301, 88, 323, 260, 338, 274, 740, 10895, 327, 638, 514, 373, 3024, 1093, 285, 1488, 79, 1706, 740, 436, 651, 1361, 342, 5301, 273, 1666, 25379, 1411, 1110, 2361, 275, 2720, 789, 50273, 249, 253, 4081, 5684, 3809, 24301, 812, 253, 4477, 19148, 752, 310, 253, 6919, 273, 12820, 8085, 908, 323, 15549, 36256, 689, 31893, 285, 752, 310, 253, 1180, 273, 5018, 908, 323, 253, 23256, 69, 2983, 327, 253, 12820, 873, 310, 253, 12820, 673, 2908, 275, 253, 673, 2361, 275, 2829, 18, 604, 417, 812, 253, 4477, 1304, 253, 2264, 673, 1690, 12820, 50273, 16534, 253, 4477, 19148, 253, 1979, 273, 253, 12820, 8085, 908, 323, 2393, 15910, 671, 275, 3036, 21, 285, 253, 1180, 273, 5018, 908, 323, 253, 23256, 69, 2983, 323, 2393, 15910, 310, 436, 5185, 2439, 512, 4679, 50272, 12311, 281, 452, 2957, 2553, 14777, 323, 253, 4081, 25774, 281, 921, 253, 5928, 273, 11786, 44790, 2074, 281, 1110, 2361, 275, 608, 50275, 38092, 8680, 417, 629, 273, 3061, 6803, 50271, 783, 4477, 3748, 253, 1563, 3738, 253, 1566, 4541, 29698, 715, 247, 1327, 18848, 461, 581, 352, 310, 26401, 1027, 432, 271, 9826, 1327, 18848, 461, 1566, 352, 651, 320, 47860, 281, 1263, 253, 3607, 273, 436, 10444, 1566, 326, 27171, 432, 36256, 689, 31893, 281, 2096, 625, 670, 2139, 352, 310, 2104, 281, 9295, 594, 4541, 50272, 783, 4477, 812, 31986, 253, 2957, 2553, 273, 253, 269, 3342, 10166, 1566, 1309, 253, 36256, 689, 31893, 285, 4745, 846, 352, 761, 12239, 436, 476, 1421, 281, 47860, 4342, 327, 752, 310, 9369, 275, 253, 21520, 273, 253, 941, 3530, 275, 1077, 1643, 25142, 253, 2957, 2553, 476, 320, 17944, 2074, 281, 3036, 18, 275, 608, 50275, 22, 492, 13429, 1162, 355, 19862, 48960, 3733, 8104, 285, 25774, 17857, 32888, 4765, 50275, 11183, 846, 30080, 22559, 50275, 783, 4477, 2380, 12453, 619, 7350, 285, 891, 651, 751, 281, 5731, 619, 4868, 281, 818, 253, 2929, 10262, 47860, 4342, 670, 269, 3342, 285, 29328, 247, 2969, 285, 3576, 1332, 323, 41427, 1625, 46701, 554, 48960, 3733, 436, 310, 4217, 417, 760, 323, 269, 3342, 533, 671, 323, 643, 1625, 46701, 554, 25774, 3738, 253, 6351, 275, 31640, 310, 16888, 41427, 1625, 46701, 554, 3733, 310, 4217, 253, 4477, 671, 12661, 247, 43245, 5919, 1332, 273, 17170, 31640, 2074, 281, 23256, 69, 3733, 50276, 7152, 33032, 2520, 2929, 29328, 271, 7756, 689, 3809, 48960, 3733, 281, 3157, 253, 31640, 273, 253, 1566, 275, 271, 5919, 5133, 4583, 697, 247, 973, 3542, 2929, 533, 352, 476, 320, 5520, 275, 2176, 4088, 347, 3637, 50275, 926, 495, 19756, 1774, 1491, 670, 752, 2173, 2983, 369, 908, 281, 11897, 253, 10237, 7200, 369, 352, 23256, 69, 604, 4754, 752, 403, 253, 23256, 69, 3602, 50276, 953, 417, 2590, 604, 253, 11701, 403, 1955, 281, 1643, 25142, 273, 23256, 69, 390, 1955, 281, 5313, 3020, 4872, 4715, 2281, 9459, 271, 6843, 4679, 10941, 253, 767, 4715, 2281, 27005, 19870, 285, 5313, 3020, 651, 320, 1175, 281, 6583, 50275, 1542, 2593, 608, 672, 4477, 403, 970, 269, 72, 3610, 347, 247, 5890, 484, 597, 943, 3730, 23813, 342, 5987, 39962, 2061, 9275, 1518, 15781, 20991, 9275, 835, 4076, 3733, 310, 908, 347, 5890, 484, 2488, 3748, 326, 4757, 273, 253, 34014, 36908, 2647, 275, 253, 3302, 3408, 273, 3733, 594, 943, 7277, 1543, 342, 11962, 4757, 1469, 281, 5058, 50275, 249, 954, 273, 253, 4679, 253, 23256, 69, 3213, 1979, 310, 5816, 25761, 352, 588, 320, 1175, 281, 923, 253, 31640, 273, 436, 5853, 1411, 11962, 23256, 69, 3213, 1979, 50275, 74, 1158, 4477, 897, 391, 16054, 3610, 285, 
3809, 48960, 3733, 28961, 1598, 534, 10513, 690, 13775, 352, 588, 320, 5322, 281, 320, 5185, 342, 253, 28939, 50275, 187, 187, 4118, 18435, 27, 2520, 2929, 806, 2340, 684, 253, 3879, 24088, 36256, 689, 31893, 273, 3809, 48960, 3733, 3809, 24301, 949, 4679, 352, 9010, 326, 253, 2234, 281, 697, 2323, 310, 253, 3745, 281, 9295, 432, 689, 31893, 281, 5075, 8104, 840, 352, 10262, 247, 2969, 4993, 3809, 24301, 326, 31167, 23256, 69, 48960, 3733, 672, 36256, 689, 31893, 310, 2540, 253, 4795, 1332, 310, 2011, 281, 320, 2104, 281, 6194, 323, 247, 1781, 1180, 273, 44540, 352, 671, 10262, 247, 2715, 3809, 24301, 88, 326, 897, 253, 5520, 3809, 48960, 3733, 347, 247, 5890, 484, 273, 268, 27421, 324, 735, 24406, 3733, 2074, 347, 275, 2045, 789, 4583, 253, 1783, 310, 4217, 285, 253, 5697, 403, 3588, 253, 16774, 1543, 671, 921, 9023, 2299, 253, 2022, 14855, 273, 824, 16774, 1783, 310, 326, 352, 778, 320, 7996, 281, 253, 7533, 24088, 50276, 1171, 44540, 19860, 273, 15302, 50276, 783, 4477, 30080, 22559, 671, 11392, 824, 2442, 7350 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: i think this paper conducts several interesting analyses about mt hallucinations and also proposes several different ways of reducing this effect. my questions are as follows. i am very curious about how you decide the chosen noisy words, and i am also wondering what the difference is if you choose different noisy words. another thing: if the noisy words are unseen in the training set, will they be treated as unk? can you highlight what is changed in the upper right side of fig 4? it would be great if you include the gloss in the figure as well.

my major concern about the work is that the studied model is quite weak. the paper states that "all models we present are well trained with a bleu score of at least 20.0 on the test set, a reasonable score for 2-layer models with 256 hidden units" and that "we then used the wmt de-en 2016 test set (2999 examples) to compute the hallucination percentage for each model". i checked the wmt official website (http://matrix.statmt.org/matrix): it shows that the best result was a bleu score of 40.2, which was obtained in 2016. the models used in this work are at about 20.0, which is much less than the wmt results reported two years ago. note that neural machine translation has made remarkable progress in the recent two years, not to mention that production systems like google translator perform much better than research systems; therefore the discoveries reported in this work are questionable. i strongly suggest the authors conduct the studies based on the latest nmt architecture, ie the transformer. furthermore, i checked the examples given in the introduction in google translator and found no hallucination, so im not sure whether such hallucinations are really critical to todays nmt systems. id like to see a study on some production translation systems, eg applying algo 1 to google translator and checking its outputs, which can better motivate this work. for the analysis in section 6.1: if attention is the root cause of hallucinations, some existing methods should have already addressed this issue. can you check whether the model trained by the following work still suffers from hallucinations: modeling coverage for neural machine translation, acl 16.

the authors introduce hallucinations in nmt and propose some algorithms to avoid them. the paper is clear, except section 6.2 which could have been more clearly described, and the work is original. the paper points out hallucination problems in nmt which look like adversarial examples in the paper "explaining and harnessing adversarial examples", so the authors might want to compare the perturbed sources to the adversarial examples. if analysis is provided for each hallucination pattern, that would be better. ### Summary:
strengths: hallucinations are a problem for seq2seq models, esp. those trained on small datasets.

weaknesses: hallucinations are known to exist, and the analyses / observations are not very novel. the considered space of hallucination sources (ie added noise) is fairly limited; it is not clear that these are the most natural sources of hallucination, and not clear if the methods defined to combat these types would generalize to other types. eg, id rather see hallucinations appearing when running nmt on some natural, albeit noisy, corpus rather than defining the noise model manually. the proposed approach is not particularly interesting and may not be general; alternative techniques, eg modeling coverage, have been proposed in the past. a wider variety of language pairs, amounts of data, etc is needed to validate the methods; this is an empirical paper, and i would expect higher quality of evaluation. two reviewers argued that the baseline system is somewhat weak and the method is not very exciting.
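The hallucination reviews above revolve around perturbing source sentences with noisy tokens and checking whether the translation collapses (the "algo 1" the reviewer wants applied to production systems). A minimal sketch of that kind of probe is below; it is only an illustration, not the paper's actual Algorithm 1, and `translate` plus the noise vocabulary are placeholders for whatever NMT system and token list are being tested.

```python
import random

def perturb_source(tokens, noise_vocab, rng=random.Random(0)):
    """Insert one randomly chosen noise token at a random position in the source."""
    pos = rng.randrange(len(tokens) + 1)
    return tokens[:pos] + [rng.choice(noise_vocab)] + tokens[pos:]

def looks_hallucinated(base_output, perturbed_output, min_overlap=0.3):
    """Crude flag: the perturbed translation shares almost no words with the original one."""
    base = set(base_output.lower().split())
    pert = set(perturbed_output.lower().split())
    return len(base & pert) / max(len(base), 1) < min_overlap

# usage sketch (translate() is a stand-in for the system being probed):
#   base = translate(" ".join(source_tokens))
#   pert = translate(" ".join(perturb_source(source_tokens, noise_vocab)))
#   hallucinated = looks_hallucinated(base, pert)
```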
[ input_ids, attention_mask, labels: token-ID arrays encoding the example above; the labels array mirrors input_ids and the attention_mask is all ones ]
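The input_ids / attention_mask / labels columns summarized above are just the tokenized form of each Input/Output pair. A minimal sketch of how such a row is typically built is below; the HuggingFace-style tokenizer, the gpt2 checkpoint, and the 2048-token cap are assumptions for illustration, not details stated by this dump.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # assumed checkpoint, not specified by this dump

PROMPT = ("Below is a review of a research paper from a conference journal. "
          "Please write a summary of the review. ### Review: ")

def build_row(review_text: str, summary_text: str, max_length: int = 2048) -> dict:
    """Turn one Input/Output pair into the three columns shown above."""
    full = PROMPT + review_text + " ### Summary: " + summary_text
    enc = tokenizer(full, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones because nothing is padded here
        "labels": list(enc["input_ids"]),         # causal-LM labels simply mirror input_ids
    }
```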
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper evaluates and compares various methods for learning gans in a continual learning setting, ie only some of the classes are available during training. it evaluates different continual learning methods, including rehearsal, ewc and generative replay, applied to training several deep generative models like gan, cgan, wgan, wgan-gp, vae and cvae on mnist, fashion mnist and cifar. the authors conclude from these experimental results that generative replay is the most effective method for such a setting, and found it is difficult to generate cifar10 images that can be classified successfully by an image classifier. i appreciate the authors for providing so many detailed experimental results to the community, but this paper lacks novelty in general: all the cl methods the authors evaluate come from other papers that are already using these methods for generative models. rehearsal has been used in vcl (nguyen et al 2017), ewc comes directly from seff et al 2017, and generative replay has been used by wu et al 2018a. the authors also fail to provide any valuable insight with these experimental results, eg analyzing why generative replay fails to improve vaes. i expect to see more exciting results coming from the authors, but the paper is not mature enough for acceptance this time.

this paper performs an empirical comparison of models and cl methods in a generative setting. the main motivation of the paper is to make statements about which model/method combinations are best to use for generative tasks in the cl setting; in short, the paper provides an empirical analysis and evaluation of the combination of cl methods and generative models. the datasets used for comparison are mnist, fashion mnist and cifar10; for each dataset, sequential class-by-class generative tasks are introduced, aligning with the cl setting. the models investigated are vaes, gans and wgans, along with their class-conditional counterparts. the cl methods investigated are (i) finetuning (a simple baseline), (ii) rehearsal methods, (iii) elastic weight consolidation (ewc) and (iv) generative replay (gr). the authors propose to use two evaluation metrics: frechet inception distance (fid) measures the quality of the generated images, and fitting capacity (fc) measures the usefulness of the images to train classifiers.

pros: the authors are correct in pointing out that most of the work on cl has been restricted to the discriminative case and that there is value in exploring generative tasks in the cl setting. empirical and experimental evaluation of this sort is useful and helps the community better understand the relationship between model, cl method and task; such an evaluation and indepth analysis is welcomed in cl, especially in the generative setting. the authors draw a number of useful conclusions, eg regarding the usefulness and dangers of employing the different cl methods.

cons: my main concern with this paper regards the evaluation metrics used. the authors propose quality metrics for the generative model, both of which directly or indirectly measure the quality of the generated images. in this setting it is unsurprising that gans outperform vaes, as they are known to generate higher-quality images; this however does not necessarily mean that they are better at the continual learning task, ie avoiding catastrophic forgetting. it seems to me that one source from which to draw would be 1, which conducted a very rigorous and useful empirical evaluation of generative models, and
the methodology followed there, ie evaluating marginal loglikelihoods via annealed importance sampling, would be more convincing evidence for empirical comparison of models, as it would somewhat detach the quality of the generated images from the ability of the model to avoid catastrophic forgetting. using their proposed image-quality metrics, the authors make statements such as "our results do not give a clear distinction between conditional and unconditional models; however, adversarial methods perform significantly better than variational methods; gans variants are able to produce better, sharper quality and variety of samples, as observed in fig 13 and 14 in appendix g; hence adversarial methods seem more viable for cl". my impression is that this statement on the viability of vaes vs gans for cl, which is a major point of the paper, does not follow from the empirical results on the quality of the generated images; it seems quite predictable that the gan-based models would produce higher quality images regardless of catastrophic forgetting.

additional minor comments: sec 2 could consist of a more thorough review of the literature, with a more indepth comparison of the different cl methods proposed and evaluated in the paper. sec 2 contains a number of statements of the form "restricted to vaes only"; for many of the cases it is not immediately clear why this is true, and in my opinion the authors should either drop those comments or make them rigorous. "vcl use specific weights for each task, which only works for the setting where the number of tasks is known in advance": unclear what exactly this means or why this is true. "while the teacher retains knowledge": how does it retain knowledge, how is this then transferred to the student, and why is this restricted to vaes?

experimental protocol: coresets for the rehearsal, as proposed by 1, could be an interesting extension. it is unclear how the samples were selected for rehearsal, and coresets represent a principled way to do so that would also be interesting to compare in this setting to a random baseline. for vaes, a potentially better metric of their ability, other than the loglikelihood as suggested by 2, would be fitting capacity or another metric over the learned latent space rather than the reconstructed image space.

overall, my impression is that while an empirical analysis of cl methods in the generative setting is a useful concept, the submission in its current form requires some improvement. in particular, i am worried that the choice of evaluation metrics may lead to incorrect or partially correct conclusions, which could of course have a negative impact on the research into cl. it also seems that the paper could use some further polishing in both writing and presentation. as such, i encourage the authors to continue the work on this empirical analysis and perhaps submit it again to future conferences.

1 nguyen et al, variational continual learning, iclr 2018
2 wu et al, on the quantitative analysis of decoder-based generative models, iclr 2017

this paper presents an empirical evaluation of continual learning approaches for generative modelling, noting that much of previous work focuses on supervised tasks. the paper evaluates various combinations of continual learning strategies (ewc, rehearsal/replay-based, or generative replay) and generative models (gans or likelihood-based). the experiments evaluate all combinations on mnist and fashion mnist, and the resulting best-performing combination on cifar. the paper is well written and structured, and although there are no new proposed algorithms or
measures, i think this has the potential to be a useful empirical study on the relatively unstudied topic of continual learning with generative models. however, my main concern is in the detail of analysis and discussion. for an empirical study it would be much more beneficial to empirically investigate why certain combinations are more effective than others. for example: is the reason gans are better than likelihood models with generative replay purely because of sample quality, or is it sufficient for the generator to learn some key characteristics for a class that lead to sufficient discriminability? why is rehearsal better for likelihood models, and how does this relate to the hypothesis of overfitting to a few real examples? the cifar10 results also require more work; it is unclear why the existing approaches could not be made to work, and whether this is a fundamental deficiency in the existing approaches or other factors (hyperparameters, architecture choices, lack of time, etc). presuming the sample quality is as good as in the wgan-gp work, given the original implementation is used for experiments, why is this insufficient for generative replay? more detailed analysis, discussion or another combinatorial study would help for cifar too.

some comments: the poor performance of ewc across the board is concerning. was this implemented by computing the fisher of the elbo with respect to parameters? was the empirical or true fisher used? why does the performance appear so poor compared to seff et al 2017? this suggests that either more thought is required on how to best protect parameters of generative models, or the baseline has not been properly implemented / tuned. given this is an entirely empirical study, i would strongly encourage the authors to release code sooner than the acceptance deadline; this can be achieved using an anonymous repository. the figure 2 and 3 plots are a little difficult to parse without axis labels. ### Summary:
this paper presents an empirical evaluation and comparison of different generative models, such as gans and vae, in the continual learning setting. to avoid catastrophic forgetting, the following strategies are considered: rehearsal, regularization, generative replay and finetuning. the empirical evaluations are carried out using three datasets: mnist, fashion mnist and cifar. while all reviewers and the ac acknowledge the importance and potential usefulness of studying and comparing different generative models in continual learning, they raised several important concerns that place this paper below the acceptance bar. 1) in an empirical study paper, an indepth analysis and more insightful evaluations are required to better understand the benefits and shortcomings of the available models (r1 and r2), eg analyzing why generative replay fails to improve vae, why rehearsal is better for likelihood models, and in general why certain combinations are more effective than others; see more suggestions in r1s and r2s comments. the authors discussed some of these questions in their response to the reviews, but a more detailed analysis is required to fully understand the benefits of this empirical study. 2) the evaluation is geared towards quality metrics for the generative models and lacks evaluation for catastrophic forgetting in continual learning, hence it favours gan models; see r3s suggestion on how to improve. to conclude, the reviewers and ac suggest that in its current state the manuscript is not ready for publication. we hope the reviews are useful for improving and revising the paper.
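The reviews above lean on two sample-quality style metrics, fid and fitting capacity, with fitting capacity defined as how useful the generated images are for training classifiers. One common reading of that metric is sketched below: train a fresh classifier on labelled samples from the class-conditional generator and report its accuracy on real held-out data. The generator interface, classifier factory and data loader are placeholders rather than the paper's code, and the generator is assumed to sample its own noise internally.

```python
import torch
import torch.nn.functional as F

def fitting_capacity(generator, make_classifier, real_test_loader,
                     n_classes=10, n_samples=5000, epochs=5, batch=128, device="cpu"):
    """Train a fresh classifier on generated (image, label) pairs, then test it on real data."""
    # 1. sample a labelled synthetic training set from the conditional generator
    labels = torch.randint(0, n_classes, (n_samples,), device=device)
    with torch.no_grad():
        images = generator(labels)  # assumed: the generator maps labels -> images, sampling noise itself
    # 2. fit a fresh classifier on the synthetic data only
    clf = make_classifier().to(device)
    opt = torch.optim.Adam(clf.parameters(), lr=1e-3)
    for _ in range(epochs):
        for i in range(0, n_samples, batch):
            loss = F.cross_entropy(clf(images[i:i + batch]), labels[i:i + batch])
            opt.zero_grad()
            loss.backward()
            opt.step()
    # 3. fitting capacity = accuracy of that classifier on real held-out data
    correct = total = 0
    with torch.no_grad():
        for x, y in real_test_loader:
            correct += (clf(x.to(device)).argmax(dim=1) == y.to(device)).sum().item()
            total += y.numel()
    return correct / total
```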
[ input_ids: token-ID array encoding the example above ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 44995, 285, 26662, 2710, 3082, 323, 4715, 305, 507, 275, 247, 45120, 4715, 4758, 26332, 760, 690, 273, 253, 5971, 403, 2130, 1309, 3733, 352, 44995, 1027, 45120, 4715, 3082, 1690, 33558, 267, 299, 38212, 285, 1006, 800, 44864, 3732, 281, 3733, 2067, 3676, 1006, 800, 3210, 751, 36827, 260, 1247, 259, 1247, 259, 26774, 81, 362, 3348, 285, 260, 21574, 327, 278, 79, 382, 8142, 278, 79, 382, 285, 260, 338, 274, 253, 4477, 7525, 342, 841, 5661, 1543, 326, 1006, 800, 44864, 310, 253, 954, 3576, 1332, 323, 824, 247, 4758, 285, 1119, 352, 310, 2834, 281, 6635, 260, 338, 274, 740, 3888, 326, 476, 320, 10509, 8379, 407, 271, 2460, 30410, 50276, 74, 11435, 253, 4477, 323, 5277, 594, 1199, 7000, 5661, 1543, 281, 253, 3114, 533, 436, 2929, 19756, 38135, 275, 2087, 512, 253, 502, 3082, 253, 4477, 7472, 1705, 432, 643, 9380, 326, 403, 2168, 970, 841, 3082, 323, 1006, 800, 3210, 33558, 267, 556, 644, 908, 275, 362, 498, 295, 39170, 1162, 355, 4240, 299, 38212, 3249, 3587, 432, 396, 567, 1162, 355, 4240, 285, 1006, 800, 44864, 556, 644, 908, 407, 259, 86, 1162, 355, 4765, 66, 253, 4477, 671, 1891, 281, 2085, 667, 9865, 12288, 342, 841, 5661, 1543, 24088, 18918, 2139, 1006, 800, 44864, 10224, 281, 3157, 13460, 265, 50275, 74, 1902, 281, 923, 625, 12302, 1543, 3551, 432, 253, 4477, 533, 253, 2929, 310, 417, 14242, 2217, 323, 14924, 436, 673, 5474, 33032, 2520, 2929, 17923, 271, 16774, 5301, 273, 3210, 285, 502, 3082, 275, 247, 1006, 800, 4758, 253, 2022, 16038, 273, 253, 2929, 310, 281, 1056, 7234, 670, 534, 1566, 9349, 13553, 403, 1682, 281, 897, 323, 1006, 800, 8892, 275, 253, 502, 4758, 275, 2159, 253, 2929, 3400, 271, 16774, 1783, 285, 7103, 273, 253, 5019, 273, 502, 3082, 285, 1006, 800, 3210, 50276, 783, 15302, 908, 323, 5301, 403, 278, 79, 382, 8142, 278, 79, 382, 285, 260, 338, 274, 740, 323, 1016, 10895, 22453, 966, 407, 966, 1006, 800, 8892, 403, 5611, 8495, 272, 342, 253, 502, 4758, 253, 3210, 6949, 403, 13460, 265, 305, 507, 285, 259, 72, 507, 2112, 342, 616, 966, 17697, 21421, 253, 502, 3082, 6949, 403, 891, 1442, 292, 25004, 247, 2969, 8245, 21255, 33558, 267, 3082, 37685, 15386, 2801, 34889, 299, 38212, 285, 21983, 1006, 800, 44864, 650, 253, 4477, 12661, 281, 897, 767, 7103, 17082, 1315, 25742, 39645, 4181, 269, 301, 5593, 253, 3290, 273, 253, 4561, 3888, 285, 13532, 5350, 269, 68, 5593, 253, 31471, 273, 253, 3888, 281, 6194, 49996, 50276, 856, 84, 50276, 783, 4477, 403, 3451, 275, 13458, 562, 326, 954, 273, 253, 789, 327, 502, 556, 644, 11096, 281, 253, 20741, 800, 1083, 285, 326, 627, 310, 1318, 275, 18216, 1006, 800, 8892, 275, 253, 502, 4758, 50276, 358, 5378, 474, 285, 5661, 7103, 273, 436, 3686, 403, 4217, 285, 1361, 253, 3114, 1805, 2096, 253, 2954, 875, 1566, 502, 1332, 285, 4836, 824, 271, 7103, 285, 801, 554, 394, 1783, 310, 25213, 275, 502, 3340, 275, 253, 1006, 800, 4758, 50276, 783, 4477, 3812, 247, 1180, 273, 4217, 11815, 24088, 5001, 253, 31471, 285, 25926, 273, 19693, 253, 1027, 502, 3082, 50276, 5040, 50276, 2577, 2022, 4468, 342, 436, 2929, 17730, 253, 7103, 17082, 908, 253, 4477, 12661, 3290, 17082, 323, 253, 1006, 800, 1566, 1097, 273, 534, 3587, 390, 21719, 2557, 253, 3290, 273, 253, 4561, 3888, 275, 436, 4758, 352, 310, 5061, 321, 20733, 326, 305, 507, 562, 32231, 13460, 265, 347, 597, 403, 1929, 281, 6635, 2169, 15177, 3888, 436, 2299, 1057, 417, 7933, 1599, 326, 597, 403, 1805, 387, 253, 45120, 4715, 4836, 26332, 
17816, 36256, 37264, 352, 3133, 281, 479, 326, 581, 2603, 432, 534, 281, 3812, 651, 320, 337, 534, 5196, 247, 1077, 26565, 285, 4217, 16774, 7103, 273, 1006, 800, 3210, 285, 253, 16182, 3560, 627, 26332, 16344, 16888, 2412, 7513, 10202, 84, 3066, 27175, 3256, 6349, 10491, 651, 320, 625, 21414, 1941, 323, 16774, 5301, 273, 3210, 347, 352, 651, 8489, 42103, 253, 3290, 273, 253, 4561, 3888, 432, 253, 3745, 273, 253, 1566, 281, 3693, 36256, 37264, 50276, 5302, 616, 4081, 2460, 15177, 17082, 253, 4477, 1056, 7234, 824, 347, 776, 1543, 513, 417, 1918, 247, 2590, 13812, 875, 17697, 285, 49795, 3210, 2299, 48960, 3082, 1347, 3012, 1805, 685, 39762, 3082, 305, 507, 11640, 403, 2104, 281, 4711, 1805, 17614, 468, 3290, 285, 5235, 273, 3530, 347, 2540, 275, 3036, 2145, 285, 1638, 275, 30762, 305, 7613, 48960, 3082, 1646, 625, 16571, 323, 502, 619, 13214, 310, 326, 436, 3908, 327, 253, 17036, 273, 13460, 265, 4632, 305, 507, 323, 502, 534, 310, 247, 2201, 1127, 273, 253, 2929, 1057, 417, 956, 432, 253, 16774, 1543, 327, 253, 3290, 273, 253, 4561, 3888, 352, 3133, 3240, 28826, 326, 253, 36827, 3169, 3210, 651, 4711, 2169, 3290, 3888, 10159, 273, 36256, 37264, 50276, 38092, 5884, 5701, 50276, 1704, 374, 812, 2882, 273, 247, 625, 11080, 2278, 273, 253, 6239, 342, 247, 625, 801, 554, 394, 5301, 273, 253, 1027, 502, 3082, 4081, 285, 6760, 275, 253, 2929, 50276, 1704, 374, 4428, 247, 1180, 273, 7234, 273, 253, 830, 11096, 281, 13460, 265, 760, 323, 1142, 273, 253, 2219, 352, 310, 417, 4745, 2590, 2139, 436, 310, 2032, 285, 275, 619, 4743, 253, 4477, 943, 2057, 5926, 1110, 5701, 390, 1056, 731, 26565, 50276, 87, 498, 897, 2173, 13461, 323, 1016, 4836, 534, 760, 2987, 323, 253, 4758, 835, 253, 1180, 273, 8892, 310, 1929, 275, 7170, 12744, 752, 4555, 436, 2097, 390, 2139, 436, 310, 2032, 50276, 6050, 253, 9732, 32751, 3640, 50276, 5430, 1057, 352, 13280, 3640, 849, 310, 436, 840, 9495, 281, 253, 5974, 285, 2139, 310, 436, 11096, 281, 13460, 265, 50276, 49363, 7241, 50276, 38337, 1507, 323, 253, 33558, 267, 347, 4081, 407, 337, 812, 320, 271, 4722, 6880, 352, 310, 12744, 849, 253, 3530, 497, 4236, 323, 33558, 267, 285, 23018, 1507, 1957, 247, 3505, 74, 6216, 1039, 281, 513, 594, 326, 651, 671, 320, 4722, 281, 7277, 275, 436, 4758, 281, 247, 3632, 8245, 50276, 1542, 13460, 265, 247, 7826, 1805, 7982, 273, 616, 3745, 643, 685, 253, 2412, 7513, 10202, 347, 5125, 407, 374, 651, 320, 13532, 5350, 390, 643, 7982, 689, 6311, 21624, 2317, 2581, 685, 253, 25578, 3888, 4511, 50276, 1189, 455, 619, 13214, 310, 326, 1223, 271, 16774, 1783, 273, 502, 3082, 275, 253, 1006, 800, 4758, 310, 247, 4217, 4473, 253, 19529, 275, 697, 1655, 830, 4419, 690, 7756, 275, 1798, 891, 717, 11926, 326, 253, 4327, 273, 7103, 17082, 778, 1421, 281, 13583, 390, 10571, 3451, 11815, 534, 812, 273, 2282, 452, 247, 4016, 3486, 327, 253, 2561, 715, 502, 352, 671, 3133, 326, 253, 2929, 812, 897, 690, 2007, 35952, 275, 1097, 4028, 285, 9759, 347, 824, 891, 11907, 253, 4477, 281, 4035, 253, 789, 327, 436, 16774, 1783, 285, 4931, 11929, 275, 969, 281, 2852, 27691, 50276, 18, 50276, 1251, 7352, 257, 1162, 355, 39762, 45120, 4715, 17857, 32888, 4765, 374, 50276, 44217, 1162, 355, 327, 253, 11745, 1783, 273, 29810, 3169, 1006, 800, 3210, 17857, 32888, 4240, 7152, 33032, 2520, 2929, 10262, 271, 16774, 7103, 273, 45120, 4715, 7274, 323, 1006, 800, 26278, 15806, 326, 1199, 273, 2045, 789, 16633, 327, 22296, 8892, 253, 2929, 44995, 2710, 13553, 273, 45120, 4715, 8130, 299, 38212, 33558, 267, 250, 1993, 3169, 390, 1006, 800, 44864, 285, 1006, 800, 3210, 305, 
507, 390, 12177, 3169, 253, 4679, 7472, 512, 13553, 327, 278, 79, 382, 285, 8142, 278, 79, 382, 285, 253, 4795, 1682, 468, 14692, 5019, 327, 260, 338, 274, 253, 2929, 310, 973, 15720, 285, 18872, 285, 3738, 627, 403, 642, 747, 4081, 11333, 390, 5593, 891, 1158, 436, 556, 253, 2442, 281, 320, 247, 4217, 16774, 1263, 327, 253, 4942, 440, 14091, 728, 9400, 273, 45120, 4715, 342, 1006, 800, 3210, 50276, 35529, 619, 2022, 4468, 310, 275, 253, 2508, 273, 1783, 285, 5955, 323, 271, 16774, 1263, 352, 651, 320, 1199, 625, 12912, 281, 45190, 7409, 2139, 2176, 13553, 403, 625, 3576, 685, 2571, 323, 1650, 50276, 261, 253, 1921, 305, 507, 403, 1805, 685, 12177, 3210, 342, 1006, 800, 44864, 15846, 984, 273, 3410, 3290, 390, 310, 352, 4209, 323, 253, 14156, 281, 3037, 690, 2234, 5319, 323, 247, 966, 326, 1421, 281, 4209, 20741, 1430, 50276, 22309, 310, 33558, 267, 1805, 323, 12177, 3210, 285, 849, 1057, 436, 14588, 281, 253, 9079, 273, 689, 31893, 281, 247, 1643, 1524, 6667, 50276, 783, 260, 338, 274, 740, 1543, 671, 2430, 625, 789, 50276, 262, 310, 12744, 2139, 253, 5368, 7274, 812, 417, 320, 1160, 281, 789, 285, 1880, 436, 310, 247, 7936, 14384, 275, 253, 5368, 7274, 390, 643, 2616, 4373, 22041, 10336, 10165, 3480, 273, 673, 3966, 9475, 272, 253, 3410, 3290, 310, 347, 1175, 347, 275, 253, 259, 26774, 81, 789, 1677, 253, 3236, 7092, 310, 908, 323, 4679, 2139, 310, 436, 12497, 323, 1006, 800, 44864, 625, 7000, 1783, 50276, 49794, 390, 1529, 38183, 1263, 651, 1361, 323, 260, 338, 274, 1512, 50276, 8826, 5701, 50276, 783, 4105, 3045, 273, 299, 38212, 2439, 253, 4450, 310, 8664, 369, 436, 9009, 407, 12672, 253, 27633, 273, 253, 1045, 2399, 342, 1675, 281, 3602, 369, 253, 16774, 390, 2032, 27633, 908, 2139, 1057, 253, 3045, 3176, 594, 4105, 2429, 281, 396, 567, 1162, 355, 4240, 436, 5936, 326, 2057, 625, 1869, 310, 2424, 327, 849, 281, 1682, 4017, 3602, 273, 1006, 800, 3210, 390, 253, 8245, 556, 417, 644, 6283, 9009, 85, 37437, 50276, 28821, 436, 310, 271, 7094, 16774, 1263, 891, 651, 7052, 11907, 253, 4477, 281, 3727, 2127, 19473, 685, 253, 14924, 20639, 50276, 2520, 476, 320, 6786, 970, 271, 17679, 18491, 50276, 13206, 374, 285, 495, 14777, 403, 247, 1652, 2834, 281, 14390, 1293, 7844, 13301, 187, 187, 4118, 18435, 27, 2520, 2929, 10262, 16774, 7103, 285, 5301, 273, 1027, 1006, 800, 3210, 824, 347, 305, 507, 285, 362, 3348, 275, 253, 45120, 4715, 4758, 50276, 936, 3693, 36256, 37264, 253, 1563, 8130, 403, 2783, 33558, 267, 37820, 1006, 800, 44864, 285, 1442, 292, 25004, 253, 16774, 27163, 403, 4824, 562, 970, 1264, 15302, 278, 79, 382, 8142, 278, 79, 382, 285, 260, 338, 274, 50275, 6050, 512, 30628, 285, 913, 14409, 253, 6349, 285, 2442, 31471, 273, 12392, 285, 10941, 1027, 1006, 800, 3210, 275, 45120, 4715, 597, 5439, 2067, 1774, 7350, 326, 1659, 436, 2929, 44462, 253, 14924, 2534, 337, 275, 271, 16774, 1263, 2929, 271, 801, 554, 394, 1783, 285, 625, 47860, 27163, 403, 2424, 281, 1805, 2096, 253, 5373, 285, 35387, 273, 253, 2130, 3210, 391, 18, 285, 391, 19, 24088, 18918, 2139, 1006, 800, 44864, 10224, 281, 3157, 362, 3348, 2139, 310, 33558, 267, 1805, 323, 12177, 3210, 285, 275, 2087, 2139, 2176, 13553, 403, 625, 3576, 685, 2571, 50276, 2887, 625, 13991, 275, 391, 18, 84, 285, 391, 19, 84, 5701, 253, 4477, 5469, 275, 616, 2380, 281, 253, 10123, 690, 273, 841, 3533, 533, 247, 625, 7000, 1783, 310, 2424, 281, 4751, 2096, 253, 5373, 273, 436, 16774, 1263, 374, 253, 7103, 310, 48526, 4404, 3290, 17082, 323, 253, 1006, 800, 3210, 285, 19756, 7103, 323, 36256, 37264, 275, 45120, 4715, 7613, 352, 2883, 2108, 
305, 507, 3210, 50276, 2887, 391, 20, 84, 14876, 849, 281, 3157, 50275, 936, 7525, 253, 30628, 285, 913, 1804, 326, 275, 697, 1655, 1375, 253, 7714, 310, 417, 4704, 323, 247, 9311, 359, 3524, 253, 10123, 403, 4217, 323, 11138, 285, 3585, 2182, 253, 2929, 209 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: A great recommender system relies on a great training set; however, at the beginning there is no such data available. This paper tries to solve the zero-shot recommendation problem, where there are no user or item overlaps. The two challenges are generalizing to unseen users and to unseen items. For unseen users, sequential recommendation represents users as a sequence of the items they have interacted with, so as long as those items have been seen before, the users can be represented. As for items, the unique IDs are not useful; however, attributes such as natural language descriptions can be universal. This paper proposes an approach based on a hierarchical Bayesian model; the universal item embedding uses a pretrained BERT network with a single-layer neural network. Extensive experiments are carried out to demonstrate the effectiveness of the proposed approach.

Pros:
1. This paper is well motivated; the problem of zero-shot recommendation is interesting yet very challenging to solve.
2. The proposed approach, combining universal item embeddings with sequential recommendation methods for user embeddings that leverage the items a user has interacted with, is a novel and reasonable approach to this problem.
3. The paper demonstrates the effectiveness of the proposed approach using real-world datasets and experiments. The questions in section 4 that the experiments try to answer are informative, and the experimental setup is solid, following both the non-overlapping and the temporal aspects.
4. It is good that the paper demonstrates the proposed framework using multiple base sequential models.

Cons:
1. The literature survey does not cover many of the more recent approaches to sequential recommendation. Although the proposed approach is model-agnostic, it would still be good to provide a thorough literature review; the same holds for the baselines compared, which are not as strong. Some example papers (there are more) are below:
- Sun, Fei, et al. "BERT4Rec: Sequential Recommendation with Bidirectional Encoder Representations from Transformer." Proceedings of the 28th ACM International Conference on Information and Knowledge Management, 2019.
- Kang, Wang-Cheng, and Julian McAuley. "Self-Attentive Sequential Recommendation." 2018 IEEE International Conference on Data Mining (ICDM), IEEE, 2018.
- Zhou, Kun, et al. "S3-Rec: Self-Supervised Learning for Sequential Recommendation with Mutual Information Maximization." Proceedings of the 29th ACM International Conference on Information and Knowledge Management, 2020.
2. It would be good to add some discussion of how easy or hard it is for the proposed approach to generalize to other modalities, such as video or image attributes. Do these modalities also share the commonalities that language has?
3. What is the sequential model used for the user universal embedding network (section 3.4) after obtaining each time step's embedding for each item?
4. In figure 2, when saying that zsesrec does not directly use the target-domain data, do you use the interaction histories of users? If so, it would be considered as using the target data; if not, how is a user represented? Won't all users get the same result if no item is recommended?
5. The limitations in terms of how close the source and target domains need to be are not discussed in the paper.

Overall, the problem being studied is interesting and challenging, and the approach proposed in this paper makes sense. The experimental questions and design are good. My major concern is question 4 in the main review; please clarify it further for the authors.
There are some areas for improvement, such as the baselines, the literature review, and the lack of discussion.

docsep

In this paper the authors identify a new and interesting problem called zero-shot recommendation, where there is no overlap of users or items between a source domain and a target domain. The main idea is to bridge the two domains via the attributes of the items and the users, which are called continuous universal item IDs and user IDs.

Strengths:
1. The authors identify a new and interesting problem called zero-shot recommendation.
2. The proposed zero-shot recommendation method is simple and generic.

Weaknesses:
1. The idea is a bit too straightforward, i.e., using the attributes of the items/users and their embeddings to bridge any two domains.
2. The technical contribution is limited, i.e., there is no significant technical contribution or extension beyond a typical model for the cross-domain recommendation setting.

In this paper the authors identify a new and interesting problem called zero-shot recommendation, where there is no overlap of users or items between a source domain and a target domain. The main idea is to bridge the two domains via the attributes of the items/users, which are called continuous universal item/user IDs. The authors conduct experiments and show that the proposed zero-shot recommendation framework works well. Overall, the paper is well presented and the studied problem, zero-shot recommendation, is interesting. However, my main concern is that the idea of bridging two non-overlapping domains via attributes is too straightforward, and the technical contribution is limited.

docsep

This paper studies zero-shot recommendation, where the source and target domains have no overlap in terms of users and items. The paper proposes to use item content features, such as applying BERT to descriptions, instead of IDs. Experiments are conducted on two offline datasets. This paper studies zero-shot recommendation, where there is no overlap between the source domain and the target domain in terms of user IDs and item IDs; this is different from recent cross-domain recommendation, where overlapping items are leveraged. The main idea is to use item content features that are generalizable, such as using BERT over item descriptions; users are represented based on the items they have interacted with. Experiments are conducted on two offline datasets, where variants of the proposed methods outperform some in-domain-only methods and simple cross-domain baselines such as random and direct embedding matching. Some case studies are provided.

Strength: the paper is clearly written, and the authors clearly communicate what has been done.

Weakness: the major issue of this paper is novelty. The reviewer agrees that zero-shot recommendation is probably an interesting future direction, but the way this problem is tackled in this paper makes it not different from known problem settings. More specifically, the paper uses content-based methods instead of IDs. Content-based recommendation is as old as ID-based recommendation, if not older; content-based recommendation is the default solution for cold-start problems and is commonly used together with IDs in practice. The method proposed in this paper is not different from the existing content-based recommendation paradigm: 1) the model is learned on the source domain and directly applied to the target domain, with no tuning on the target domain, e.g., by using some unsupervised methods. The important assumption here is that the source and target domains are actually still in-domain; otherwise the model will just fail, since nothing is done in the target domain.
Then it is not really different from content-based methods for the cold-start problem, which has been studied for decades. The reviewer understands that for real-world applications these are different problems, but in terms of machine learning, the focus of this conference, the reviewer does not see novelty. Actually, in terms of applications: 2) one motivation for zero-shot recommendation is that small new websites do not have enough data; however, the paper assumes that users in the target domain have interactions available, otherwise there would be no user representation. This is confusing, since the reviewer does not understand whether this is really zero-shot or not; many recommendation papers use such datasets, so it is a standard data setting. Furthermore, the experiments are conducted on two offline datasets, and the authors need to do a bunch of data massaging to make it zero-shot. This makes the reviewer wonder how valuable this setting is: if user interactions are already available in the target domain, why is zero-shot needed? Also, ideally there should be real-world use cases showing that this setting can really benefit new websites. Overall, the reviewer recommends that the authors think carefully about why zero-shot recommendation is a novel, meaningful setting and why the proposed methods in this paper have any meaningful novelty. Currently, the problem setting and the proposed method do not seem to really show novelty, regardless of the bar of ICLR.

Reply to rebuttal: the reviewer acknowledges the response but is not convinced.
1. The authors argue that "our zero-shot setting is fundamentally different from content-based cold start." The reviewer understands the difference of the setting and mentioned in the original review that this setting can have value if tackled concretely. The reviewer's point is that the proposed method makes assumptions under which the new setting works only in scenarios that are virtually the same as cold-start or cross-domain recommendation. More specifically, only inference is done on the target domain, without any adjustment; this will clearly fail in many cases. The authors keep using claims such as "completely different" and "extreme cold start," but what would happen, for example, if the model learned on the MIND dataset were applied to the Amazon datasets? How would the owner of a new website decide which dataset to use, and how would they get such datasets?
2. The authors argue that their setting is zero-shot because the target-domain data is not used during training but only during inference, to simulate online scenarios where new businesses have just opened and customers are using the service in real time, and that what the reviewer refers to is batch access ahead of time, which is distinct from their online-access case. The authors also argue that previous initial-phase recommenders can only collect low-quality interactions and take much longer to collect data, because inaccurate recommendations are not appealing to users; the authors referred to figure 2 (the incremental training) to argue for effectiveness. The reviewer finds some of these arguments and the incremental-training experiments confusing. First, the data used to train recommender systems is not necessarily from the recommendation UI; it is common practice to just use interaction data regardless of where it comes from. In fact, one can argue that interactions not from the UI do not carry many kinds of biases and are of higher quality. Second, figure 2 is not really measuring what the authors are arguing about: noticeably, the proposed methods have constant performance over the whole time period,
while other methods improve as more training data becomes available. How could the proposed method work well when there are no interactions, given that its inference depends on users' past interactions? The reviewer notices that the test set is on a later date (i.e., week 5 for MIND) and the proposed method already leverages the previous interactions from the first 4 weeks during inference. Again, I understand it is not trained to update any parameters, but think about what would happen in practice if such interaction data were available anyway for the proposed model to do inference: (1) the performance in day 1 or week 1 is not measured, which is what the authors are really arguing about; on the real day 1, without any interactions, the proposed method will generate nothing, or the same set of items for all users. Does it really help bootstrapping a new website? (2) If the proposed method has the first 4 weeks of data for inference, then other methods should use them for training; it is just that the proposed method cannot train with them. So the reviewer feels the experiments are not fair under the current setting. Please let me know if I misunderstood anything about the experimental setting. Though the paper is well written, the reviewer finds it difficult to believe that zero-shot recommendation is a novel setting and that the proposed methods differ from traditional content-based recommender systems in meaningful ways. ### Summary:
The authors propose zero-shot recommendations, a scenario in which knowledge from a recommender system enables a second recommender system to provide recommendations in a new domain (i.e., new users, new items). The idea developed by the authors is to transfer knowledge through the item content information and the user behaviors. The initial assessment of the reviewers indicated that this paper was likely not yet ready for publication. The reviewers all recognized the potential usefulness of zero-shot recommendations but argued that the implications of the proposed setup were somewhat unclear; most notably, the reviewers raised the issue of how widely applicable this is in terms of the distance between source and target domains, since presumably the quality of the zero-shot recommendations depends on that distance. The reviewers also noted that this is an application paper. This is of course within the CFP, and recommender systems papers have been published at ICLR in the past (for example, one of the initial session-based RecSys papers with RNNs), but the potential audience for this work is somewhat lower at ICLR. I should also add that I agree with the authors that their model is novel, but it is very much tailored to this application, and it was unclear to me how it might be impactful on its own; all in all, this did not play a significant role in my recommendation. During the discussion there were significant yet respectful disagreements between the authors and the reviewers. It also seems that perhaps the authors missed an important reply from reviewer hjb8, made available through their updated review (see "reply to rebuttal"), so the discussion between reviewers and authors did not converge. Having said that, even the two most positive reviewers have scores that would make this paper a very borderline one (a 6 and a 5). Further, I do find that reviewer hjb8's arguments have merit and require another round of review. In particular, I think the role and effect of your simulated online scenario should be further discussed (note that I did read the new paragraph on it in your latest manuscript); for example, comparing to a baseline that can train with the data from the new domain would be useful, even if at some point it ends up being an upper bound on the performance of your approach. I also found the question raised by the reviewer around the MIND results to be pertinent; further characterizing pairs of domains in which the approach works or fails, even if only empirically, would add depth to this paper. All in all, this paper has interesting ideas, and I strongly encourage the authors to provide a more thorough experimental setup that fully explores the benefits and limitations of their zero-shot approach.
[input_ids, attention_mask, and labels token-ID arrays for this example omitted; they are the tokenized form of the review and summary text above]
608, 50275, 44295, 891, 513, 1089, 326, 30628, 288, 27686, 25, 7125, 452, 15785, 285, 2430, 1529, 3790, 273, 2278, 275, 1798, 891, 1158, 253, 2554, 285, 1055, 273, 634, 15524, 3909, 10076, 943, 320, 2007, 5469, 3877, 326, 891, 858, 1239, 253, 747, 12494, 327, 352, 432, 634, 6323, 7714, 323, 1650, 10941, 281, 247, 8245, 326, 476, 6194, 342, 253, 941, 432, 436, 747, 5028, 651, 320, 4217, 1014, 604, 387, 690, 1127, 352, 7637, 598, 1146, 271, 5170, 3033, 327, 253, 3045, 273, 634, 2746, 891, 671, 1119, 253, 1953, 5439, 407, 253, 37317, 1475, 253, 2564, 1543, 281, 320, 21452, 2007, 39330, 8557, 273, 10625, 275, 534, 253, 2746, 4896, 10224, 1014, 604, 45190, 651, 823, 6864, 281, 436, 2929, 50275, 455, 275, 512, 436, 2929, 556, 4722, 5697, 285, 891, 7052, 11907, 253, 4477, 281, 2085, 247, 625, 11080, 5661, 9978, 326, 4751, 33826, 253, 5373, 285, 7364, 273, 616, 1182, 254, 6934, 302, 2746 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper shows sketching methods to speed up iterative hessian sketch ihs and hessian regression with decently accurate subspaces learned from qr decomposition ihs and hessian regression achieve faster convergence than data oblivious methods pros this paper shows provable speedups on the secondorder methods introduced in previous works and further demonstrates with experiments the previous works are properly surveyed and discussed in the context of the proposed methods cons and questions to authors despite the above contributions the writing is ambiguous and does not properly draw relationships among different parts especially among section 2 3 and 4 i would increase my score if the authors could properly address the doubts how does countsketch introduced in section 31 relate to the methods proposed in section 4 and 5 it seems both section 4 and 5 rely on the qr method in section 32 the results in 31 are more of standard leverage score sampling results and are irrelevant to the proposed approaches how can the matrix t be quickly chosen in algorithm 2 line 7 and what is the relationship between t and si and t and a it is not clear from the writing why such a t is readily available cost of each iteration at the bottom of page 7 the authors claim that no additional computational cost is incurred in generating s other than solving the iteration step using countsketch and guassian matrices and sparse jl transforms will be considerably slower however the iteration step itself requires running algorithm 2 or 3 each of which includes qr on the original a matrix algorithm 2 line 8 algorithm 3 line 2 which are also expensive i would suggest the authors compare periteration complexities of the approaches shown in figure 1 besides it is not clear how algorithm 2 line 10 is implemented section 32 at the bottom of page 5 says we found empirically that not squaring this loss function works better than squaring it why does notsquaring matter if s is solved by just minimizing the loss mathcalls ai itself and some minor issues the notations at the bottom of page 5 and in lemma 33 at the beginning of page 6 are inconsistent page 5 says sai qi ri1 whereas page 6 says sa qr also it would be better if the authors could expand the mathcalls ai term at the bottom of page 5 to explicitly show how s goes into the loss what are the ai bis in section 61 below equation 7 are they just a and b the features and labels of the corresponding datasets it may be nice to show these numerics the convergence of firstorder methods in figure 14 as a baseline and how the spectrum of a looks like as mentioned above despite the contributions the writing if not the method itself is often confusing i would suggest the authors address these ambiguities in both the author response and the revised version docsepin this paper the authors extend a line of work focused on sketching the hessian for convex problems to help accelerate second order optimization methods in particular they present an algorithm for learning weights of a sketching matrix with one nonzero entry per columns chosen uniformly by using gradient descent the hessians used for training are treated as draws from a distribution of hessians they also discuss how leverage scores can be used to improve convergence rates by ensuring that heavy rows are sampled with probability 1 the authors show empirically that the learned sketches improve convergence rates and 
reduce the number of iterations for several problems they also provide theoretical results stating a reduction in rows required in the sketch when heavy leverage scores rows are known and errortime complexity bounds for the hessian sketchregression problems strengths the paper is well written and gives a nice background of existing literature with strong motivation the technical detail is of good quality and the numerical experiments provided show that the method performs well on selected data sets when comparing number of iterations required the idea is simple and intuitively appealing the problems addressed specifically least squares are ubiquitous and finding efficient ways to solve high dimensional problems is of perennial interest in addition errorcomplexity bounds are provided when used in iterative hessian sketch and fast regression weaknesses the paper claims to provide a framework for learned sketching that applies to a large number of problems in convex optimization but still focuses on least squares problems it seems that the primary contribution of this paper is to apply learned sketches for least squares problems as introduced in works by liu et al on learned sketches for randomized numerical linear algebra 2020 and learning the positions in countsketch 2020 to iterative hessian sketch and fast regression although theoretical bounds are provided the contribution is more limited than initially stated questions and comments 1 is it common that the hessian can be decomposed as at a with a in mathbbrn times d and n gg d for other convex problems this is common for ls but is it often observed elsewhere that makes it more useful for non ls problems 2 for learning the locations of the nonzero entries as discussed in 31 how is the oracle trained to predict these heavy rows are the number of occurrences of heavy rows tabulated and the the k most common rows to have leverage scores over some threshold selected more detail on this front would be helpful 3 although the sketches can be learned offline and finished within 5 minutes it is unclear how the sketch training time compares with the time to solve the problem a comparison of training time and time to convergence would be helpful to evaluate its merit minor comments p in mathbbnn rather than being a real vector the notation ai bi isnt entirely clear it clear indicates the sixe since the paper primarily focuses on the application of a learned sketch to an iteratively solved ls problem both components well established elsewhere the contribution seems marginal i believe the paper is marginally below the acceptance threshold as is the paper can be improved by addressing questions and comments above in particular evidence that the method is applicable to more general convex problems and a more comprehensive comparison for total time to solve compared to naive sketching docsepthis paper considers how learned countsketches can be used in a variety of optimization tasks the authors propose a method for predicting rows with high leverage score based on the training data these rows are then sampled deterministically alongside either a standard countsketch or the learned countsketch introduced by indyk et al 2019 some novel theory is presented for this new sketch strengths s1 learning of sketch matrices is an interesting twist to the more standard random sketches out there s2 the theory for the learned heavy row sketch is a nice attempt at providing justification for why the proposed learning method might improve results weaknesses w1 it is 
unclear how useful these learned sketches actually are in applications it seems like the learned sketches could easily break to put it in machine learning terms the heavy row sketch uses row indices as features to predict whether or not rows are heavy this is somewhat akin to classifying images on a social networking site based on the id of the user who uploaded them if the user with that id uploaded mostly cat pictures in the training data then classify the newly uploaded picture as a cat picture this means that if the rows of a design matrix in the testing data are permuted then the learned heavy row sketch isnt useful anymore this could for example happen in the electricity dataset if two residents switch homes it therefore seems like in practice you would need to retrain the model frequently to ensure that the leverage score estimates remain accurate w2 the datasets in the experiments are very small scale and can be solved very easily using deterministic methods it would have been more interesting to see the performance on larger scale datasets where these kind of techniques would potentially be more useful w3 the improvements that the learned sketches yield dont seem that substantial in some of the experiments for example in figures 1 and 46 it seems like they just save a few iterations compared to doing standard countsketch w4 some parts of the paper are unclear see questions below questions q1 how is line 10 in alg 2 computed q2 below theorem 42 you discuss how a better subspace embedding can lead to a faster convergence but its not clear from the discussion why this is the case in particular its not clear why the discussion around improved accuracy guarantees would lead to improved convergence rates what am i missing q3 out of curiosity how do you implement the various countsketch matrices do you form sparse matrices in python with the relevant nonzeros and multiply or do you do something more refined to make the computation faster q4 the datasets are poorly introduced what does b represent in the electric and ghg datasets how is the data in a in the ghg dataset organizedis time along the rows with different columns corresponding to different measurement sites q5 also does abtrain400 and abtest100 mean that you have a total of 500 pairs of design matrices and corresponding vectors b and that you use 400 for training and the rest for testing if yes are all the reported results the average results over all pairs ab in the testing data q6 its not clear why some methods are not given in some plots why is learnedcombined not plotted in figure 1 when it appears in figure 2 why are learnedvalueonly and learnedcombined not included in figure 3 why are learnedvalueonly learnedcombined and countsketch not included in figure 4 q7 what is the difference between the three plots in figure 1 please explain in the caption also the yaxis labels are cut off and not showing properly q8 in section 62 why are you using newtons method to solve a least squares problem is there any benefit to doing that rather than just doing a single sketchandsolve on the least squares problem itself q9 in the last paragraph before the conclusion you say that you only run one iteration for each subroutine what does that mean it looks like you do multiple iterations in figure 4 q10 one of the questions you mention in the introduction is 2 should we apply the same learned sketch in each iteration or learn it in the next iteration by training on a data set involving previously learned sketches from prior iterations you then 
say that you will answer this question but i didnt see this discussed in the paper you briefly mention it in the paragraph training in section 62 but i didnt see it properly addressed anywhere did i miss something other minor things in footnote on page 3 inut should be input the conclusion heading is too big for a paragraph heading you should put it as a proper section heading you should specify the unit used in the runtime label on the xaxis of figure 6 im assuming its seconds the core idea of learning sketch matrices is interesting the theory that provides a potential justification for why learned heavy row sketching might improve on the oblivious countsketch is welcome however the learned oracle for the heavy row sketch seems very easy to break simply permuting rows would mean that it has to be relearned overall its not clear how useful the learned sketch would be in practice for that reason the experiments are all done on very small problems that dont need sketching parts of the paper are also unclear as outlined above for these reasons i think its below the acceptance threshold docsepthis paper studies matrix sketching algorithms where the sketching matrix is learned or generated adaptively theoretically this is done by favoring more the rows of a with high leverage scores while practically this is done by associating that set with rows of large norms it evaluates this algorithm on both l1 and l2 regression and observes improved test errors my background is more in theoretical matrix algorithms so to me the idea of adaptively generating sketches based on leverage scores is somehow more fundamental than sketching the authors do do a good job addressing this related literature but their results are mostly direct consequences of matrix concentration bounds the experimental results are interesting im not aware of such evaluations of test errors in previous works they also point to significant gains of these methods over oblivious sketching based methods on the other hand these studies mostly take place on small to moderate sized data where the performance gains from sketching is unclear so i feel they serve mostly as proof of concept of the utility of such sketching and perhaps also point out that different formulationsobjectives are necessary for giving better test errors i feel this paper is well thought out and the experimental results are highly interesting however i also feel that the studies taken here adaptively select rows can be taken one step deeper in particular it would be very interesting to see theoretical analyses of why such learned sketching give improved results over oblivious sketches ### Summary:
this paper proposes a new contribution in the recent literature on learning distributions of sketches while all reviewers have recognized the overall good quality of the presentation two factors seem to weigh heavily on a negative decision clarifications on the contributions scope presented as a tool for general hessians in the introduction but ultimately only applied to leastsquare errors of linear predictors to recover an explicit factorization of the hessian matrix and links with existing literature weakness of experiments whose small scale does not justify using sketches in the first place since this is a learning approach i am particularly sensitive to the latter point and therefore am inclined to reject but i encourage the authors to address these two issues with the current draft
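The review text above repeatedly refers to CountSketch, sketch-and-solve least squares, and iterative Hessian sketch. A minimal NumPy sketch of those primitives, assuming toy dimensions and a dense sketch matrix for readability — the function names, sizes, and iteration count are illustrative choices, not the reviewed paper's implementation:

```python
import numpy as np

def countsketch(n_rows, sketch_size, rng):
    # One nonzero per column: each row of A is hashed to a random bucket with a random sign.
    S = np.zeros((sketch_size, n_rows))
    buckets = rng.integers(0, sketch_size, size=n_rows)
    signs = rng.choice([-1.0, 1.0], size=n_rows)
    S[buckets, np.arange(n_rows)] = signs
    return S

def sketch_and_solve(A, b, sketch_size, rng):
    # Oblivious sketch-and-solve least squares: solve min_x ||S(Ax - b)|| on the smaller problem.
    S = countsketch(A.shape[0], sketch_size, rng)
    x, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
    return x

def ihs_step(A, b, x, sketch_size, rng):
    # One iterative Hessian sketch update: approximate A^T A by (SA)^T (SA),
    # then take a Newton-like step using the exact gradient A^T (Ax - b).
    S = countsketch(A.shape[0], sketch_size, rng)
    SA = S @ A
    grad = A.T @ (A @ x - b)
    return x - np.linalg.solve(SA.T @ SA, grad)

rng = np.random.default_rng(0)
A = rng.standard_normal((10_000, 50))
b = A @ rng.standard_normal(50) + 0.01 * rng.standard_normal(10_000)
x = sketch_and_solve(A, b, sketch_size=500, rng=rng)
for _ in range(5):
    x = ihs_step(A, b, x, sketch_size=500, rng=rng)
print(np.linalg.norm(A @ x - b))
```

In practice the sketch matrix would be applied implicitly (hashing rows on the fly) rather than materialized densely; the dense form is used here only to keep the example short.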
[ 5474, 339, 9852, 436, 2929, 253, 4477, 9017, 247, 1386, 273, 789, 7106, 327, 30547, 7695, 253, 344, 859, 757, 323, 17133, 3237, 281, 1361, 28523, 1273, 1340, 13757, 3082, 275, 1798, 597, 1246, 271, 5933, 323, 4715, 13461, 273, 247, 30547, 7695, 4315, 342, 581, 28078, 5857, 591, 9930, 6777, 17568, 407, 970, 11786, 18499, 253, 344, 859, 2458, 908, 323, 3733, 403, 4127, 347, 21354, 432, 247, 3268, 273, 344, 859, 2458, 597, 671, 2319, 849, 25057, 7363, 476, 320, 908, 281, 3157, 14940, 4142, 407, 17749, 326, 5536, 10175, 403, 19958, 342, 5912, 337, 50275, 783, 4477, 921, 45190, 326, 253, 6311, 46159, 3157, 14940, 4142, 285, 4796, 253, 1180, 273, 25142, 323, 2067, 3237, 597, 671, 2085, 10527, 1543, 14851, 247, 5141, 275, 10175, 2424, 275, 253, 23211, 672, 5536, 25057, 7363, 10175, 403, 1929, 285, 1486, 430, 553, 10454, 14493, 323, 253, 344, 859, 757, 23211, 1747, 1256, 3237, 50276, 296, 3755, 20556, 253, 2929, 310, 973, 3542, 285, 4245, 247, 5322, 4114, 273, 5368, 6239, 342, 2266, 16038, 253, 7681, 2508, 310, 273, 1175, 3290, 285, 253, 10704, 4679, 2530, 921, 326, 253, 1332, 17923, 973, 327, 4236, 941, 5239, 672, 10941, 1180, 273, 25142, 2424, 253, 2934, 310, 2969, 285, 540, 41597, 23176, 253, 3237, 9713, 5742, 1878, 19325, 403, 33079, 285, 4560, 5919, 4088, 281, 8415, 1029, 15759, 3237, 310, 273, 49714, 1600, 275, 1635, 2228, 19017, 414, 14493, 403, 2530, 672, 908, 275, 34560, 344, 859, 757, 23211, 285, 3809, 9077, 50274, 20881, 1255, 265, 253, 2929, 3916, 281, 2085, 247, 7792, 323, 6311, 30547, 7695, 326, 10384, 281, 247, 1781, 1180, 273, 3237, 275, 17133, 13757, 533, 1335, 16633, 327, 1878, 19325, 3237, 352, 3133, 326, 253, 3625, 7680, 273, 436, 2929, 310, 281, 4647, 6311, 46159, 323, 1878, 19325, 3237, 347, 5611, 275, 2987, 407, 632, 86, 1162, 355, 327, 6311, 46159, 323, 14871, 10704, 4872, 8697, 9169, 285, 4715, 253, 6887, 275, 1385, 3319, 11429, 9169, 281, 34560, 344, 859, 757, 23211, 285, 3809, 9077, 3738, 10527, 14493, 403, 2530, 253, 7680, 310, 625, 3710, 685, 8523, 4767, 50272, 34974, 285, 5701, 337, 310, 352, 1846, 326, 253, 344, 859, 757, 476, 320, 45765, 347, 387, 247, 342, 247, 275, 14168, 67, 1288, 79, 2069, 277, 285, 295, 305, 72, 277, 323, 643, 17133, 3237, 436, 310, 1846, 323, 35253, 533, 310, 352, 2223, 2540, 11358, 326, 2789, 352, 625, 4217, 323, 1327, 35253, 3237, 374, 323, 4715, 253, 8593, 273, 253, 28078, 12028, 347, 5469, 275, 4562, 849, 310, 253, 42295, 10166, 281, 3283, 841, 5536, 10175, 403, 253, 1180, 273, 37102, 273, 5536, 10175, 10334, 2907, 285, 253, 253, 465, 954, 1846, 10175, 281, 452, 25057, 7363, 689, 690, 7887, 4236, 625, 2508, 327, 436, 2914, 651, 320, 9371, 50276, 20, 3738, 253, 46159, 476, 320, 6311, 28841, 285, 6699, 1561, 608, 2909, 352, 310, 12744, 849, 253, 23211, 3733, 673, 26662, 342, 253, 673, 281, 8415, 253, 1895, 247, 5301, 273, 3733, 673, 285, 673, 281, 14940, 651, 320, 9371, 281, 7472, 697, 15785, 50271, 37585, 5701, 50276, 81, 275, 14168, 4482, 9866, 2581, 685, 1146, 247, 1524, 4972, 50276, 783, 14951, 23105, 1794, 310, 2649, 7094, 2590, 352, 2590, 6492, 253, 2800, 70, 50276, 17480, 253, 2929, 8558, 16633, 327, 253, 2898, 273, 247, 6311, 23211, 281, 271, 10040, 3146, 14042, 35253, 1895, 1097, 4295, 973, 4232, 11358, 253, 7680, 3133, 16888, 891, 2868, 253, 2929, 310, 42876, 2708, 253, 14924, 7887, 347, 310, 253, 2929, 476, 320, 5520, 407, 15974, 3533, 285, 5701, 1840, 275, 1798, 1941, 326, 253, 1332, 310, 7763, 281, 625, 2087, 17133, 3237, 285, 247, 625, 11088, 5301, 323, 2264, 673, 281, 8415, 2429, 281, 27785, 30547, 7695, 5474, 33032, 2520, 
2929, 19401, 849, 6311, 1385, 3319, 292, 2706, 476, 320, 908, 275, 247, 5235, 273, 13757, 8892, 253, 4477, 12661, 247, 1332, 323, 21565, 10175, 342, 1029, 25057, 4868, 1754, 327, 253, 3733, 941, 841, 10175, 403, 840, 19958, 11544, 18260, 12936, 2057, 247, 2629, 1385, 3319, 11429, 390, 253, 6311, 1385, 3319, 11429, 5611, 407, 801, 25073, 1162, 355, 6247, 690, 4460, 3762, 310, 3559, 323, 436, 747, 23211, 50275, 296, 3755, 20556, 50276, 84, 18, 4715, 273, 23211, 12624, 310, 271, 4722, 19152, 281, 253, 625, 2629, 3632, 46159, 562, 627, 50276, 84, 19, 253, 3762, 323, 253, 6311, 5536, 4194, 23211, 310, 247, 5322, 3177, 387, 5277, 22861, 323, 2139, 253, 4081, 4715, 1332, 1537, 3157, 1543, 50274, 20881, 1255, 265, 50276, 88, 18, 352, 310, 12744, 849, 4217, 841, 6311, 46159, 2686, 403, 275, 4893, 352, 3133, 751, 253, 6311, 46159, 812, 4354, 2740, 281, 1691, 352, 275, 5145, 4715, 2426, 253, 5536, 4194, 23211, 4648, 4194, 14452, 347, 3386, 281, 3283, 1880, 390, 417, 10175, 403, 5536, 436, 310, 8489, 33917, 281, 49653, 3888, 327, 247, 2675, 23245, 2670, 1754, 327, 253, 2654, 273, 253, 2608, 665, 28228, 731, 604, 253, 2608, 342, 326, 2654, 28228, 6571, 5798, 7968, 275, 253, 3733, 941, 840, 30215, 253, 9841, 28228, 5406, 347, 247, 5798, 5406, 436, 2097, 326, 604, 253, 10175, 273, 247, 2216, 4315, 275, 253, 5175, 941, 403, 8143, 4525, 840, 253, 6311, 5536, 4194, 23211, 310, 2649, 4217, 10542, 436, 812, 323, 1650, 5108, 275, 253, 13978, 10895, 604, 767, 8811, 5234, 9076, 352, 3103, 3133, 751, 275, 3946, 368, 651, 878, 281, 851, 1949, 253, 1566, 7208, 281, 5416, 326, 253, 25057, 4868, 8197, 3464, 7899, 50276, 88, 19, 253, 15302, 275, 253, 4679, 403, 1077, 1355, 4311, 285, 476, 320, 14042, 1077, 4354, 970, 30027, 3082, 352, 651, 452, 644, 625, 4722, 281, 923, 253, 3045, 327, 4067, 4311, 15302, 835, 841, 2238, 273, 5609, 651, 7826, 320, 625, 4217, 50275, 88, 20, 253, 11701, 326, 253, 6311, 46159, 4917, 13414, 1646, 326, 6832, 275, 690, 273, 253, 4679, 323, 1650, 275, 8442, 337, 285, 7904, 352, 3133, 751, 597, 816, 5321, 247, 1643, 25142, 2429, 281, 2509, 2629, 1385, 3319, 11429, 50276, 88, 21, 690, 4243, 273, 253, 2929, 403, 12744, 923, 3533, 2708, 50275, 34974, 50276, 82, 18, 849, 310, 1386, 884, 275, 20320, 374, 10302, 50276, 82, 19, 2708, 10012, 5976, 368, 2319, 849, 247, 1805, 24822, 21496, 476, 1421, 281, 247, 7938, 14940, 533, 697, 417, 2590, 432, 253, 5955, 2139, 436, 310, 253, 1083, 275, 1798, 697, 417, 2590, 2139, 253, 5955, 1475, 5520, 7200, 23632, 651, 1421, 281, 5520, 14940, 4142, 752, 717, 891, 5816, 50276, 82, 20, 562, 273, 24536, 849, 513, 368, 3359, 253, 2710, 1385, 3319, 11429, 12624, 513, 368, 830, 23507, 12624, 275, 15548, 342, 253, 4623, 1327, 8260, 375, 285, 30247, 390, 513, 368, 513, 1633, 625, 22407, 281, 1056, 253, 13782, 7938, 50276, 82, 21, 253, 15302, 403, 15225, 5611, 752, 1057, 270, 1957, 275, 253, 5637, 285, 32798, 72, 15302, 849, 310, 253, 941, 275, 247, 275, 253, 32798, 72, 10895, 10932, 261, 673, 2112, 253, 10175, 342, 1027, 9930, 3969, 281, 1027, 6814, 4375, 50275, 82, 22, 671, 1057, 490, 24382, 8320, 285, 490, 2566, 2313, 50276, 10722, 326, 368, 452, 247, 2264, 273, 6783, 8557, 273, 2216, 12624, 285, 3969, 11390, 270, 285, 326, 368, 897, 9166, 323, 3733, 285, 253, 1551, 323, 5175, 604, 4754, 403, 512, 253, 2361, 1543, 253, 3388, 1543, 689, 512, 8557, 490, 275, 253, 5175, 941, 50275, 82, 23, 697, 417, 2590, 2139, 690, 3082, 403, 417, 1677, 275, 690, 14777, 2139, 310, 6311, 17890, 967, 417, 17944, 275, 4677, 337, 672, 352, 4620, 275, 4677, 374, 2139, 403, 6311, 2877, 7483, 
285, 6311, 17890, 967, 417, 2908, 275, 4677, 495, 2139, 403, 6311, 2877, 7483, 6311, 17890, 967, 285, 1385, 3319, 11429, 417, 2908, 275, 4677, 577, 50276, 82, 24, 752, 310, 253, 3064, 875, 253, 1264, 14777, 275, 4677, 337, 4496, 5513, 275, 253, 11743, 671, 253, 340, 10565, 13301, 403, 2624, 745, 285, 417, 4645, 6283, 50275, 82, 25, 275, 2593, 9743, 2139, 403, 368, 970, 747, 24787, 1332, 281, 8415, 247, 1878, 19325, 1895, 310, 627, 667, 5649, 281, 2509, 326, 2581, 685, 816, 2509, 247, 2014, 23211, 2287, 3247, 327, 253, 1878, 19325, 1895, 3139, 50276, 82, 26, 275, 253, 1390, 12494, 1078, 253, 6452, 368, 1333, 326, 368, 760, 1408, 581, 19502, 323, 1016, 749, 27861, 460, 752, 1057, 326, 1599, 352, 4453, 751, 368, 513, 2709, 25142, 275, 4677, 577, 50276, 82, 740, 581, 273, 253, 3533, 368, 3748, 275, 253, 10199, 310, 374, 943, 359, 4647, 253, 1072, 6311, 23211, 275, 1016, 19502, 390, 3037, 352, 275, 253, 1735, 19502, 407, 3733, 327, 247, 941, 873, 7668, 3786, 6311, 46159, 432, 2720, 25142, 368, 840, 1333, 326, 368, 588, 3662, 436, 1953, 533, 891, 42126, 923, 436, 5469, 275, 253, 2929, 368, 13366, 3748, 352, 275, 253, 12494, 3733, 275, 2593, 9743, 533, 891, 42126, 923, 352, 6283, 9713, 9825, 858, 891, 2985, 1633, 50274, 977, 5884, 1841, 50275, 249, 43302, 327, 3239, 495, 275, 307, 943, 320, 3280, 50275, 783, 6452, 13590, 310, 1512, 1943, 323, 247, 12494, 13590, 368, 943, 1691, 352, 347, 247, 1463, 2593, 13590, 50275, 5658, 943, 13199, 253, 3943, 908, 275, 253, 20243, 5203, 327, 253, 1269, 10565, 273, 4677, 721, 516, 7384, 697, 7253, 50276, 783, 5161, 2934, 273, 4715, 23211, 12624, 310, 4722, 253, 3762, 326, 3400, 247, 2442, 22861, 323, 2139, 6311, 5536, 4194, 30547, 7695, 1537, 3157, 327, 253, 38611, 784, 1385, 3319, 11429, 310, 10112, 2299, 253, 6311, 42295, 323, 253, 5536, 4194, 23211, 3133, 1077, 3477, 281, 2740, 3365, 8143, 9634, 10175, 651, 1599, 326, 352, 556, 281, 320, 1693, 1596, 264, 4583, 697, 417, 2590, 849, 4217, 253, 6311, 23211, 651, 320, 275, 3946, 323, 326, 1921, 253, 4679, 403, 512, 2218, 327, 1077, 1355, 3237, 326, 13414, 878, 30547, 7695, 4243, 273, 253, 2929, 403, 671, 12744, 347, 18627, 1840, 323, 841, 4606, 891, 1158, 697, 2708, 253, 14924, 7887, 50276, 7152, 33032, 2520, 2929, 2175, 4315, 30547, 7695, 11333, 835, 253, 30547, 7695, 4315, 310, 6311, 390, 4561, 5223, 1242, 28055, 436, 310, 2218, 407, 48065, 625, 253, 10175, 273, 247, 342, 1029, 25057, 7363, 1223, 18236, 436, 310, 2218, 407, 1709, 839, 326, 873, 342, 10175, 273, 1781, 22429, 352, 44995, 436, 5933, 327, 1097, 298, 18, 285, 298, 19, 9077, 285, 40687, 5520, 1071, 6332, 619, 4114, 310, 625, 275, 10527, 4315, 11333, 594, 281, 479, 253, 2934, 273, 5223, 1242, 11365, 46159, 1754, 327, 25057, 7363, 310, 10380, 625, 7936, 685, 30547, 7695, 253, 4477, 513, 513, 247, 1175, 2628, 15974, 436, 2905, 6239, 533, 616, 1543, 403, 6571, 1480, 9099, 273, 4315, 4719, 14493, 50276, 783, 5661, 1543, 403, 4722, 516, 417, 6600, 273, 824, 27163, 273, 1071, 6332, 275, 2045, 2987, 597, 671, 1127, 281, 1534, 15988, 273, 841, 3082, 689, 38611, 784, 30547, 7695, 1754, 3082, 327, 253, 643, 1133, 841, 2175, 6571, 1379, 1659, 327, 1355, 281, 10290, 25180, 941, 835, 253, 3045, 15988, 432, 30547, 7695, 310, 12744, 594, 891, 1928, 597, 5752, 6571, 347, 4737, 273, 4473, 273, 253, 11839, 273, 824, 30547, 7695, 285, 4931, 671, 1127, 562, 326, 1027, 26850, 6082, 1644, 403, 3309, 323, 4933, 1805, 1071, 6332, 891, 1928, 436, 2929, 310, 973, 1869, 562, 285, 253, 5661, 1543, 403, 4122, 4722, 2299, 891, 671, 1928, 326, 253, 2175, 2668, 1060, 5223, 1242, 
3609, 10175, 476, 320, 2668, 581, 3213, 12861, 275, 1798, 352, 651, 320, 1077, 4722, 281, 923, 10527, 6260, 273, 2139, 824, 6311, 30547, 7695, 1918, 5520, 1543, 689, 38611, 784, 46159, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 747, 7680, 275, 253, 3332, 6239, 327, 4715, 10670, 273, 46159, 1223, 512, 30628, 452, 7478, 253, 4583, 1175, 3290, 273, 253, 9759, 767, 2616, 1646, 281, 2801, 11306, 327, 247, 4016, 3061, 8254, 6787, 327, 253, 9021, 7990, 3559, 347, 247, 4968, 323, 2087, 344, 859, 2458, 275, 253, 10199, 533, 9142, 760, 3732, 281, 1878, 15044, 6332, 273, 4872, 23477, 281, 9295, 271, 6843, 39401, 273, 253, 344, 859, 757, 4315, 285, 4859, 342, 5368, 6239, 14855, 273, 4679, 3692, 1355, 4311, 1057, 417, 15249, 970, 46159, 275, 253, 806, 1659, 1580, 436, 310, 247, 4715, 2746, 891, 717, 3782, 7996, 281, 253, 6158, 1127, 285, 3103, 717, 21802, 281, 12009, 533, 891, 11907, 253, 4477, 281, 2953, 841, 767, 3374, 342, 253, 1655, 7482 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 5474, 339, 9852, 436, 2929, 253, 4477, 9017, 247, 1386, 273, 789, 7106, 327, 30547, 7695, 253, 344, 859, 757, 323, 17133, 3237, 281, 1361, 28523, 1273, 1340, 13757, 3082, 275, 1798, 597, 1246, 271, 5933, 323, 4715, 13461, 273, 247, 30547, 7695, 4315, 342, 581, 28078, 5857, 591, 9930, 6777, 17568, 407, 970, 11786, 18499, 253, 344, 859, 2458, 908, 323, 3733, 403, 4127, 347, 21354, 432, 247, 3268, 273, 344, 859, 2458, 597, 671, 2319, 849, 25057, 7363, 476, 320, 908, 281, 3157, 14940, 4142, 407, 17749, 326, 5536, 10175, 403, 19958, 342, 5912, 337, 50275, 783, 4477, 921, 45190, 326, 253, 6311, 46159, 3157, 14940, 4142, 285, 4796, 253, 1180, 273, 25142, 323, 2067, 3237, 597, 671, 2085, 10527, 1543, 14851, 247, 5141, 275, 10175, 2424, 275, 253, 23211, 672, 5536, 25057, 7363, 10175, 403, 1929, 285, 1486, 430, 553, 10454, 14493, 323, 253, 344, 859, 757, 23211, 1747, 1256, 3237, 50276, 296, 3755, 20556, 253, 2929, 310, 973, 3542, 285, 4245, 247, 5322, 4114, 273, 5368, 6239, 342, 2266, 16038, 253, 7681, 2508, 310, 273, 1175, 3290, 285, 253, 10704, 4679, 2530, 921, 326, 253, 1332, 17923, 973, 327, 4236, 941, 5239, 672, 10941, 1180, 273, 25142, 2424, 253, 2934, 310, 2969, 285, 540, 41597, 23176, 253, 3237, 9713, 5742, 1878, 19325, 403, 33079, 285, 4560, 5919, 4088, 281, 8415, 1029, 15759, 3237, 310, 273, 49714, 1600, 275, 1635, 2228, 19017, 414, 14493, 403, 2530, 672, 908, 275, 34560, 344, 859, 757, 23211, 285, 3809, 9077, 50274, 20881, 1255, 265, 253, 2929, 3916, 281, 2085, 247, 7792, 323, 6311, 30547, 7695, 326, 10384, 281, 247, 1781, 1180, 273, 3237, 275, 17133, 13757, 533, 1335, 16633, 327, 1878, 19325, 3237, 352, 3133, 326, 253, 3625, 7680, 273, 436, 2929, 310, 281, 4647, 6311, 46159, 323, 1878, 19325, 3237, 347, 5611, 275, 2987, 407, 632, 86, 1162, 355, 327, 6311, 46159, 323, 14871, 10704, 4872, 8697, 9169, 285, 4715, 253, 6887, 275, 1385, 3319, 11429, 9169, 281, 34560, 344, 859, 757, 23211, 285, 3809, 9077, 3738, 10527, 14493, 403, 2530, 253, 7680, 310, 625, 3710, 685, 8523, 4767, 50272, 34974, 285, 5701, 337, 310, 352, 1846, 326, 253, 344, 859, 757, 476, 320, 45765, 347, 387, 247, 342, 247, 275, 14168, 67, 1288, 79, 2069, 277, 285, 295, 305, 72, 277, 323, 643, 17133, 3237, 436, 310, 1846, 323, 35253, 533, 310, 352, 2223, 2540, 11358, 326, 2789, 352, 625, 4217, 323, 1327, 35253, 3237, 374, 323, 4715, 253, 8593, 273, 253, 28078, 12028, 347, 5469, 275, 4562, 849, 310, 253, 42295, 10166, 281, 3283, 841, 5536, 10175, 403, 253, 1180, 273, 37102, 273, 5536, 10175, 10334, 2907, 285, 253, 253, 465, 954, 1846, 10175, 281, 452, 25057, 7363, 689, 690, 7887, 4236, 625, 2508, 327, 436, 2914, 651, 320, 9371, 50276, 20, 3738, 253, 46159, 476, 320, 6311, 28841, 285, 6699, 1561, 608, 2909, 352, 310, 12744, 849, 253, 23211, 3733, 673, 26662, 342, 253, 673, 281, 8415, 253, 1895, 247, 5301, 273, 3733, 673, 285, 673, 281, 14940, 651, 320, 9371, 281, 7472, 697, 15785, 50271, 37585, 5701, 50276, 81, 275, 14168, 4482, 9866, 2581, 685, 1146, 247, 1524, 4972, 50276, 783, 14951, 23105, 1794, 310, 2649, 7094, 2590, 352, 2590, 6492, 253, 2800, 70, 50276, 17480, 253, 2929, 8558, 16633, 327, 253, 2898, 273, 247, 6311, 23211, 281, 271, 10040, 3146, 14042, 35253, 1895, 1097, 4295, 973, 4232, 11358, 253, 7680, 3133, 16888, 891, 2868, 253, 2929, 310, 42876, 2708, 253, 14924, 7887, 347, 310, 253, 2929, 476, 320, 5520, 407, 15974, 3533, 285, 5701, 1840, 275, 1798, 1941, 326, 253, 1332, 310, 7763, 281, 625, 2087, 17133, 3237, 285, 247, 625, 11088, 5301, 323, 2264, 673, 281, 8415, 2429, 281, 27785, 30547, 7695, 5474, 33032, 2520, 
2929, 19401, 849, 6311, 1385, 3319, 292, 2706, 476, 320, 908, 275, 247, 5235, 273, 13757, 8892, 253, 4477, 12661, 247, 1332, 323, 21565, 10175, 342, 1029, 25057, 4868, 1754, 327, 253, 3733, 941, 841, 10175, 403, 840, 19958, 11544, 18260, 12936, 2057, 247, 2629, 1385, 3319, 11429, 390, 253, 6311, 1385, 3319, 11429, 5611, 407, 801, 25073, 1162, 355, 6247, 690, 4460, 3762, 310, 3559, 323, 436, 747, 23211, 50275, 296, 3755, 20556, 50276, 84, 18, 4715, 273, 23211, 12624, 310, 271, 4722, 19152, 281, 253, 625, 2629, 3632, 46159, 562, 627, 50276, 84, 19, 253, 3762, 323, 253, 6311, 5536, 4194, 23211, 310, 247, 5322, 3177, 387, 5277, 22861, 323, 2139, 253, 4081, 4715, 1332, 1537, 3157, 1543, 50274, 20881, 1255, 265, 50276, 88, 18, 352, 310, 12744, 849, 4217, 841, 6311, 46159, 2686, 403, 275, 4893, 352, 3133, 751, 253, 6311, 46159, 812, 4354, 2740, 281, 1691, 352, 275, 5145, 4715, 2426, 253, 5536, 4194, 23211, 4648, 4194, 14452, 347, 3386, 281, 3283, 1880, 390, 417, 10175, 403, 5536, 436, 310, 8489, 33917, 281, 49653, 3888, 327, 247, 2675, 23245, 2670, 1754, 327, 253, 2654, 273, 253, 2608, 665, 28228, 731, 604, 253, 2608, 342, 326, 2654, 28228, 6571, 5798, 7968, 275, 253, 3733, 941, 840, 30215, 253, 9841, 28228, 5406, 347, 247, 5798, 5406, 436, 2097, 326, 604, 253, 10175, 273, 247, 2216, 4315, 275, 253, 5175, 941, 403, 8143, 4525, 840, 253, 6311, 5536, 4194, 23211, 310, 2649, 4217, 10542, 436, 812, 323, 1650, 5108, 275, 253, 13978, 10895, 604, 767, 8811, 5234, 9076, 352, 3103, 3133, 751, 275, 3946, 368, 651, 878, 281, 851, 1949, 253, 1566, 7208, 281, 5416, 326, 253, 25057, 4868, 8197, 3464, 7899, 50276, 88, 19, 253, 15302, 275, 253, 4679, 403, 1077, 1355, 4311, 285, 476, 320, 14042, 1077, 4354, 970, 30027, 3082, 352, 651, 452, 644, 625, 4722, 281, 923, 253, 3045, 327, 4067, 4311, 15302, 835, 841, 2238, 273, 5609, 651, 7826, 320, 625, 4217, 50275, 88, 20, 253, 11701, 326, 253, 6311, 46159, 4917, 13414, 1646, 326, 6832, 275, 690, 273, 253, 4679, 323, 1650, 275, 8442, 337, 285, 7904, 352, 3133, 751, 597, 816, 5321, 247, 1643, 25142, 2429, 281, 2509, 2629, 1385, 3319, 11429, 50276, 88, 21, 690, 4243, 273, 253, 2929, 403, 12744, 923, 3533, 2708, 50275, 34974, 50276, 82, 18, 849, 310, 1386, 884, 275, 20320, 374, 10302, 50276, 82, 19, 2708, 10012, 5976, 368, 2319, 849, 247, 1805, 24822, 21496, 476, 1421, 281, 247, 7938, 14940, 533, 697, 417, 2590, 432, 253, 5955, 2139, 436, 310, 253, 1083, 275, 1798, 697, 417, 2590, 2139, 253, 5955, 1475, 5520, 7200, 23632, 651, 1421, 281, 5520, 14940, 4142, 752, 717, 891, 5816, 50276, 82, 20, 562, 273, 24536, 849, 513, 368, 3359, 253, 2710, 1385, 3319, 11429, 12624, 513, 368, 830, 23507, 12624, 275, 15548, 342, 253, 4623, 1327, 8260, 375, 285, 30247, 390, 513, 368, 513, 1633, 625, 22407, 281, 1056, 253, 13782, 7938, 50276, 82, 21, 253, 15302, 403, 15225, 5611, 752, 1057, 270, 1957, 275, 253, 5637, 285, 32798, 72, 15302, 849, 310, 253, 941, 275, 247, 275, 253, 32798, 72, 10895, 10932, 261, 673, 2112, 253, 10175, 342, 1027, 9930, 3969, 281, 1027, 6814, 4375, 50275, 82, 22, 671, 1057, 490, 24382, 8320, 285, 490, 2566, 2313, 50276, 10722, 326, 368, 452, 247, 2264, 273, 6783, 8557, 273, 2216, 12624, 285, 3969, 11390, 270, 285, 326, 368, 897, 9166, 323, 3733, 285, 253, 1551, 323, 5175, 604, 4754, 403, 512, 253, 2361, 1543, 253, 3388, 1543, 689, 512, 8557, 490, 275, 253, 5175, 941, 50275, 82, 23, 697, 417, 2590, 2139, 690, 3082, 403, 417, 1677, 275, 690, 14777, 2139, 310, 6311, 17890, 967, 417, 17944, 275, 4677, 337, 672, 352, 4620, 275, 4677, 374, 2139, 403, 6311, 2877, 7483, 
285, 6311, 17890, 967, 417, 2908, 275, 4677, 495, 2139, 403, 6311, 2877, 7483, 6311, 17890, 967, 285, 1385, 3319, 11429, 417, 2908, 275, 4677, 577, 50276, 82, 24, 752, 310, 253, 3064, 875, 253, 1264, 14777, 275, 4677, 337, 4496, 5513, 275, 253, 11743, 671, 253, 340, 10565, 13301, 403, 2624, 745, 285, 417, 4645, 6283, 50275, 82, 25, 275, 2593, 9743, 2139, 403, 368, 970, 747, 24787, 1332, 281, 8415, 247, 1878, 19325, 1895, 310, 627, 667, 5649, 281, 2509, 326, 2581, 685, 816, 2509, 247, 2014, 23211, 2287, 3247, 327, 253, 1878, 19325, 1895, 3139, 50276, 82, 26, 275, 253, 1390, 12494, 1078, 253, 6452, 368, 1333, 326, 368, 760, 1408, 581, 19502, 323, 1016, 749, 27861, 460, 752, 1057, 326, 1599, 352, 4453, 751, 368, 513, 2709, 25142, 275, 4677, 577, 50276, 82, 740, 581, 273, 253, 3533, 368, 3748, 275, 253, 10199, 310, 374, 943, 359, 4647, 253, 1072, 6311, 23211, 275, 1016, 19502, 390, 3037, 352, 275, 253, 1735, 19502, 407, 3733, 327, 247, 941, 873, 7668, 3786, 6311, 46159, 432, 2720, 25142, 368, 840, 1333, 326, 368, 588, 3662, 436, 1953, 533, 891, 42126, 923, 436, 5469, 275, 253, 2929, 368, 13366, 3748, 352, 275, 253, 12494, 3733, 275, 2593, 9743, 533, 891, 42126, 923, 352, 6283, 9713, 9825, 858, 891, 2985, 1633, 50274, 977, 5884, 1841, 50275, 249, 43302, 327, 3239, 495, 275, 307, 943, 320, 3280, 50275, 783, 6452, 13590, 310, 1512, 1943, 323, 247, 12494, 13590, 368, 943, 1691, 352, 347, 247, 1463, 2593, 13590, 50275, 5658, 943, 13199, 253, 3943, 908, 275, 253, 20243, 5203, 327, 253, 1269, 10565, 273, 4677, 721, 516, 7384, 697, 7253, 50276, 783, 5161, 2934, 273, 4715, 23211, 12624, 310, 4722, 253, 3762, 326, 3400, 247, 2442, 22861, 323, 2139, 6311, 5536, 4194, 30547, 7695, 1537, 3157, 327, 253, 38611, 784, 1385, 3319, 11429, 310, 10112, 2299, 253, 6311, 42295, 323, 253, 5536, 4194, 23211, 3133, 1077, 3477, 281, 2740, 3365, 8143, 9634, 10175, 651, 1599, 326, 352, 556, 281, 320, 1693, 1596, 264, 4583, 697, 417, 2590, 849, 4217, 253, 6311, 23211, 651, 320, 275, 3946, 323, 326, 1921, 253, 4679, 403, 512, 2218, 327, 1077, 1355, 3237, 326, 13414, 878, 30547, 7695, 4243, 273, 253, 2929, 403, 671, 12744, 347, 18627, 1840, 323, 841, 4606, 891, 1158, 697, 2708, 253, 14924, 7887, 50276, 7152, 33032, 2520, 2929, 2175, 4315, 30547, 7695, 11333, 835, 253, 30547, 7695, 4315, 310, 6311, 390, 4561, 5223, 1242, 28055, 436, 310, 2218, 407, 48065, 625, 253, 10175, 273, 247, 342, 1029, 25057, 7363, 1223, 18236, 436, 310, 2218, 407, 1709, 839, 326, 873, 342, 10175, 273, 1781, 22429, 352, 44995, 436, 5933, 327, 1097, 298, 18, 285, 298, 19, 9077, 285, 40687, 5520, 1071, 6332, 619, 4114, 310, 625, 275, 10527, 4315, 11333, 594, 281, 479, 253, 2934, 273, 5223, 1242, 11365, 46159, 1754, 327, 25057, 7363, 310, 10380, 625, 7936, 685, 30547, 7695, 253, 4477, 513, 513, 247, 1175, 2628, 15974, 436, 2905, 6239, 533, 616, 1543, 403, 6571, 1480, 9099, 273, 4315, 4719, 14493, 50276, 783, 5661, 1543, 403, 4722, 516, 417, 6600, 273, 824, 27163, 273, 1071, 6332, 275, 2045, 2987, 597, 671, 1127, 281, 1534, 15988, 273, 841, 3082, 689, 38611, 784, 30547, 7695, 1754, 3082, 327, 253, 643, 1133, 841, 2175, 6571, 1379, 1659, 327, 1355, 281, 10290, 25180, 941, 835, 253, 3045, 15988, 432, 30547, 7695, 310, 12744, 594, 891, 1928, 597, 5752, 6571, 347, 4737, 273, 4473, 273, 253, 11839, 273, 824, 30547, 7695, 285, 4931, 671, 1127, 562, 326, 1027, 26850, 6082, 1644, 403, 3309, 323, 4933, 1805, 1071, 6332, 891, 1928, 436, 2929, 310, 973, 1869, 562, 285, 253, 5661, 1543, 403, 4122, 4722, 2299, 891, 671, 1928, 326, 253, 2175, 2668, 1060, 5223, 1242, 
3609, 10175, 476, 320, 2668, 581, 3213, 12861, 275, 1798, 352, 651, 320, 1077, 4722, 281, 923, 10527, 6260, 273, 2139, 824, 6311, 30547, 7695, 1918, 5520, 1543, 689, 38611, 784, 46159, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 747, 7680, 275, 253, 3332, 6239, 327, 4715, 10670, 273, 46159, 1223, 512, 30628, 452, 7478, 253, 4583, 1175, 3290, 273, 253, 9759, 767, 2616, 1646, 281, 2801, 11306, 327, 247, 4016, 3061, 8254, 6787, 327, 253, 9021, 7990, 3559, 347, 247, 4968, 323, 2087, 344, 859, 2458, 275, 253, 10199, 533, 9142, 760, 3732, 281, 1878, 15044, 6332, 273, 4872, 23477, 281, 9295, 271, 6843, 39401, 273, 253, 344, 859, 757, 4315, 285, 4859, 342, 5368, 6239, 14855, 273, 4679, 3692, 1355, 4311, 1057, 417, 15249, 970, 46159, 275, 253, 806, 1659, 1580, 436, 310, 247, 4715, 2746, 891, 717, 3782, 7996, 281, 253, 6158, 1127, 285, 3103, 717, 21802, 281, 12009, 533, 891, 11907, 253, 4477, 281, 2953, 841, 767, 3374, 342, 253, 1655, 7482 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper considers the problem of privately generating synthetic datasets based on a private dataset for which each data point is a mix of numerical and categorical variables the bestperforming past approach in this area is an approach called momentmatching which tries to generate a synthetic dataset that minimizes the error of marginal queries ie queries that take 01 variables xi and ask for the expectation of prodi in s xi made on the synthetic database they build off an algorithm called rap which repeatedly privately chooses new marginal queries for which the current synthetic database has high error and then tries to minimize the error on all queries chosen so far past approaches made numerical variables suitable for marginal queries by placing them into bins thus making them categorical the authors argue the binning heuristic is impractical they instead propose two new classes of queries the first applies a threshold to the numerical variables to turn them into 01 variables the second applies a linear classifier to a subset of the numerical variables to generate a 01 variable the most challenging step in the rap algorithm is to after choosing a set of queries find a synthetic dataset that minimizes the error on these queries rap does this by defining a fractional relaxation of synthetic datasets and an error function over fractional datasets which allows one to use continuous optimization techniques the authors propose rap which is rap but also with the option to choose queries from their two proposed classes to make the error function differentiable the authors approximate the thresholdlinear classifier in the queries with a sigmoid function however a sigmoid function has to either be highly nonsmooth which makes it hard to optimize over or poorly approximate threshold functions the authors propose that rather than doing a oneshot optimization over the error function defined using sigmoids instead using an annealingstyle algorithm that repeatedly solves the optimization problem but each time decreases the temperature of the sigmoids in experiments the authors compare rap to several other synthetic data generation methods in the literature the authors show that rap produces synthetic datasets with better accuracyerror than the other methods in most settings however since rap places less emphasis on marginal queries just on categorical data in its training process in tasks where numerical data is not useful it is outperformed by some other methods the authors also show the runtime of rap is much less than the runtime of the nextbest method in their experiments originality to the best of my knowledge while the algorithm in this paper builds upon a preexisting algorithm rap the two improvements made to that algorithm new types of queries and an annealing method for optimization are both fairly original furthermore the problem is fairly wellstudied so a new approach for the problem is more novel than perhaps say a new approach appearing in the second or third paper on a problem quality the methods the author propose appear to be very natural despite their originalitynovelty the privacy guarantees of the algorithm are sound and easy to see the experiments appear to be fairly extensive and support the authors claims i do have some questionsconcerns about the experiments however which i have placed in the questions section of the review clarity i thought that the paper was very 
clean from an exposition standpoint the introduction does a good job explaining the issue with past approaches and when explaining the algorithm i felt the design decisions the authors made were wellmotivated by their explanation of the problems those decisions addressed there are some clarity issues with the experimental designresults related to my questionsconcerns from the previous bullet significance i think the paper is fairly impactful synthetic data generation is a wellstudied problem that many practitioners are trying to solve in production in turn algorithms like the rap algorithm proposed in this paper have a reasonable chance to be deployed in practice on wide scales aside from the concerns in questions i dont believe there are any major limitations or negative social impacts not addressed by the authors docsepthis paper provides a method for generating synthetic differentiallyprivate datasets for use in answering statistical queries without the need for binning  specific query types that were analyzed include mixed marginal queries class conditional linear threshold queries and querying the error the is an improvement over previous work for certain classes of statistical queries the main improvement in this paper is for the case of mixedtype queries queries which contain a mixture of categorical and numerical features the technique in the paper is an extension of the relaxed projection mechanism by considering the k hardest queries and also adding a simulated annealing step  the experimental results are also impressive  overall this would be a worthwhile contribution however i do have some concerns one question i had is about the lack of theorems in the paper  epsilondelta privacy follows straightforwardly from the addition of the noise how about other components eg added error  another thing that would help if the authors explained the where the main difficulty of applying these known techniques lies yes the limitations have been addressed docsepthe paper iterates upon the relaxed adaptive projection rap framework from abk21 a moment matching approach for generating private synthetic data the primary improvement wrt to the prior work is the ability of handling a mixture of both categorical and numerical features where rap requires a discretization of the numeric domain via binning this is achieved in two steps 1 numeric based queries are introduced which are different types of threshold queries and 2 a differentiable approximation to these numeric queries are introduced with via a tempered sigmoid annealing query the later is key in the optimization of the private synthetic data with these tools to handle numerical based queries differential privacy mechanisms are used to ensure that the synthetic data is private in summary the approach can be broken down as follows 1 the k worst query functions are selected wrt error and using report noisy topk a dp mechanism 2 the values of these queries are calculated with the gaussian dp mechanism 3 the query functions in 1 are converted to their differentiable approximations 4 a projection step occurs using results of 2 and 3 to find the best data wrt error these steps done in the papers algorithm 2 the proof of epsilon delta dp follows from standard composition and postprocessing properties of zero concentration dp experimentally the paper uses their approach to generate synthetic data for multitask learning multiple columns of labels to target in classification with this in mind the paper evaluates their approach over many variants of 
the acs dataset the results present appear promising for both mixedmarginal query evaluation and linear classification strengths the paper presents an iterative improvement over rap this is particularly shown in the experimental section where rap is competitive or superior to other prior work the explanation and motivation for the tempered sigmoid the subsequent queries and annealing approach was good weaknesses although there is a privacy guarantee and promising experimental results there is not an theoretical error analysis from briefly looking at abk21 rap has a bound on query error there are a few unclear aspects of the paper fig 2 5 cites the appendix for comparison against other approaches but this does not appear in the appendix points which could be rephrased as questions comments are reiterated below the author states the limitation for ml downstream tasks to that of only linear classification it would be useful to see explicit experiments on nonlinear classifiers trained on rap synthetic datasets edit addressed in the reviewer discussion ### Summary:
this paper provides a method for generating synthetic differentiallyprivate datasets for use in answering statistical queries including mixed marginal queries class conditional linear threshold queries and querying the error this is an improvement over previous work a solid paper that all reviewers are positive about
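The review text above describes numeric threshold queries, their tempered-sigmoid relaxation with an annealed temperature, and the Gaussian mechanism used to privatize query answers. A minimal sketch of those three pieces, assuming a toy numeric column and hypothetical parameter values — the sensitivity and sigma below are illustrative and not calibrated to a specific (epsilon, delta) guarantee:

```python
import numpy as np

def threshold_query(x_col, tau):
    # Hard (non-differentiable) numeric threshold query: fraction of rows with value >= tau.
    return float(np.mean(x_col >= tau))

def tempered_sigmoid_query(x_col, tau, temperature):
    # Differentiable surrogate; as temperature -> 0 it approaches the hard threshold,
    # which is what an annealing schedule over decreasing temperatures gradually enforces.
    return float(np.mean(1.0 / (1.0 + np.exp(-(x_col - tau) / temperature))))

def gaussian_mechanism(true_answer, sensitivity, sigma, rng):
    # Gaussian mechanism: perturb a query answer with N(0, (sensitivity * sigma)^2) noise.
    return true_answer + rng.normal(0.0, sensitivity * sigma)

rng = np.random.default_rng(1)
ages = rng.uniform(18, 90, size=5_000)          # toy numeric column
exact = threshold_query(ages, tau=40.0)
noisy = gaussian_mechanism(exact, sensitivity=1.0 / len(ages), sigma=10.0, rng=rng)
print(exact, noisy)
for t in (5.0, 1.0, 0.1):                       # decreasing temperature ~ annealing
    print(t, tempered_sigmoid_query(ages, tau=40.0, temperature=t))
```

The projection step discussed in the review would optimize synthetic data against many such differentiable surrogates at once; this snippet only illustrates the query relaxation and the noise addition in isolation.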
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper is an empirical study looking at how different dataset properties affect model calibration in the context of vision tasks all experiments use a specific wellknown vision model resnet 50 in particular the dataset properties that are investigated are balanced/unbalanced classes label quality dataset size augmentations nlp i briefly present the main conclusions below balance in classes often times some classes have way more datapoints than others the authors look at four datasets cifar 10 cifar 100 eurosat inaturalist the last one's classes are unbalanced whereas the first three require some sampling method to artificially make them unbalanced note in this case by design there is no relationship between balance/unbalance and the class properties figure 1 shows the results for cifar and eurosat those classes with more examples are better calibrated the trend is somewhat similar for inaturalist then the authors present a number of approaches people have tried in the past to mitigate the consequences of unbalance in data they repeat the previous experiment on cifar 10 cifar 100 eurosat but this time using each of those methods while training the model table 1 shows the results the ratio column offers very mixed results depending on the dataset and method the authors conclude that overall the imbalance in calibration persists in most cases q how do these results compare to accuracy one would also expect to do better on classes with more data label quality the authors tackle the question of how label noise affects calibration in order to do that they artificially inject noise into the true labels with increasing probability figure 2 summarizes the calibration error for a number of datasets and noise levels the pattern is clear the more noise the worse the calibration importantly the calibration is measured on a test set that is not perturbed with random noise accordingly results were to be expected there is a mismatch between training and test distributions and the further apart they are the less meaningful predicted probabilities one should expect again it would be informative to see how the accuracy of the model also degrades under these circumstances similarly figure 3 shows the effect of nonuniform noise across classes those classes attacked with more noise are worse calibrated dataset size another important practical aspect to study is dataset size the authors subsample uniformly at random a fraction of the data points and measure ece figure 4 shows how models trained on more data are better calibrated again the accuracy of the model should also be shown for context augmentations it is common to use data augmentation to train better models augmentations make the effective dataset size larger figure 5 shows how removing augmentation axes leads to worse calibration the same probably applies to accuracy that is the reason why people use this this result is probably intimately related to the previous point dataset size nlp the conclusions regarding dataset size also hold with a transformer on an nlp dataset finally section 4 provides some theoretical explanation we can summarize this as the crossentropy loss wants to have more and more confidence probability on the right class for a given example and when the data is small and the model powerful enough we can basically memorize it to make crossentropy happy this however leads to overconfidence and poor calibration on one hand it is recently
becoming clear that ece is not a very robust estimator depending on design choices such as number of bins argmax vs all adaptive versus fixed bins etc the ranking among models and conclusions can change wildly 1 on the other hand this study fixed a specific model so one could say that the conclusions are shown for the dataset model pairs still i believe the conclusions are true in a more general setting though and the model is fairly reasonable however while the paper is titled dataset curation beyond accuracy i do not see how the outcome and conclusions of all these experiments would be different if we were looking at accuracy rather than calibration the authors should measure include and address this and try to disentangle both aspects or argue for any correlation causation relationship among them 1 measuring calibration in deep learning https://arxiv.org/abs/1904.01685 docsep this work is an empirical survey of the calibration problem with convnets the authors use several existing benchmark datasets and create synthetic classimbalance for datasets that are initially balanced they then extend the wellknown results on the higher prediction error of the minority class to its calibration error the work investigates several existing methods that alleviate prediction error in imbalanced datasets and examines their effect on calibration error lastly the effect of dataset size and data augmentation on calibration error is reported later on the effect of random label noise is also examined the observations although not surprising have not been reported before the work is interesting the writing is clear and the experiments are comprehensive although the observations are very informative the overall contribution of the paper is not sufficient for the iclr venue the work is mostly focused on reporting an existing issue with no major theoretical analysis of the problem and no guidelines for alleviating the mentioned problems the paper is in an interesting direction but needs to become more mature questions and suggestions 1 the label noise experiments are interesting in reality label noise is rarely random and is structured it would be more helpful if the authors could extend the experiment to incorporate such scenarios 2 there seems to be an interesting difference among various reweighting methods in table 1 it would be interesting if the authors compared their calibration error performance to their prediction error performance to find out if there is a tradeoff or the two phenomena are in the same direction 3 in a lot of experiments for instance the dataset size it is expected to have higher calibration error for smaller data it would be more informative if the general trend of calibration error is compared with the trend in prediction error side by side docsep in this work the authors demonstrate that dataset properties can significantly affect calibration and suggest that calibration should be measured during dataset curation in the field of applied ai to real-life problems we face all the time decisions on what is the most effective strategy in the pipeline eg sampling noise labeling and this paper presents some evidence for those decisions this type of work is important to systematically highlight areas or processes to follow in model development the study is not very novel but important since the conclusions are very important and have key implications i would suggest applying this to more datasets and also some of the existing synthetic datasets personally i would like to see if these observations remain solid with more datasets and more variation of datasets i did not find any inconsistencies docsep this paper discussed how data properties eg label noise label imbalance data size affect calibration error the authors designed experiments on varying computer vision datasets ie cifar10 cifar100 eurosat and inaturalist qualitatively 1 calibration error for various individual classes under a class-imbalance situation 2 calibration error for different scales of label noise 3 calibration error under nonuniform noise 4 calibration error under various scales of dataset size 5 calibration error under different combinations of data augmentations the experimental results show that poor calibration performance accompanies large noisy-label rates large imbalance ratios and small dataset sizes for the case of small dataset size causing poor calibration error this paper provided a theoretical proof advantages the idea of considering a softmax cross-entropy logit loss to help explain how data size affects the calibration error is interesting major concerns organization should be improved in particular the factors that affect the calibration error should be listed and well described in a separate section eg intro background affected data properties experiments and the theoretical motivation could also be integrated in such a section rather than put after the experiments the novelty and practicability of this paper is limited since this paper only tells people that low label quality and small data size would raise calibration error the paper analyzed the factors qualitatively but not quantitatively so in future research it is still hard to justify how much calibration error the current dataset would bring or to tell whether the current classifier would be robust enough to defend against the calibration error brought by the current set an example is 1 robustness of classifiers from adversarial to random noise fawzi et al nips2016 this paper analyzed the robustness of classifiers quantitatively considering adversarial and random noise minor comments table 1 expinbalance should be expimbalance should the captions of figure 2 and figure 3 be changed assumption 1 xi xi should read xi xj equation 2 sumiab sumiab there are many typos in this paper should go over the paper again and correct these small mistakes ### Summary:
the authors empirically analyse the properties of datasets which lead to poor calibration in particular they show that high class imbalance high degree of label noise and small dataset size are all likely to lead to poor overall calibration or poor perclass calibration while there are some interesting insights in this work the reviewers argued that the contribution is not substantial enough for iclr to improve the manuscript the authors should consider accuracy and calibration jointly and extend the results pertaining to label noise which were appreciated by the reviewers for the former the same conclusions hold for accuracy instead of calibration which raises the question of their relationship is there a tradeoff for the latter the reviewers pointed to a concrete extension with structured label noise finally the theoretical analysis is a step in the right direction but the assumption on the width of the network required to fit the training set is too restrictive in practice therefore i will recommend rejection
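Since every result above hinges on how calibration is scored, a minimal sketch of the usual equal-width-bin expected calibration error (ECE) and of the uniform label-noise injection described in the experiments may help; the number of bins, the noise model, and all names are assumptions for illustration, and the first review's caveat that ECE is sensitive to exactly these binning choices applies here too.

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=15):
    # probs: (N, C) predicted class probabilities; labels: (N,) integer ground truth.
    conf = probs.max(axis=1)        # confidence of the predicted class
    pred = probs.argmax(axis=1)
    correct = (pred == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            # gap between average accuracy and average confidence, weighted by bin mass
            ece += in_bin.mean() * abs(correct[in_bin].mean() - conf[in_bin].mean())
    return ece

def inject_uniform_label_noise(labels, noise_rate, n_classes, seed=0):
    # Flip each training label to a uniformly random class with probability `noise_rate`.
    rng = np.random.default_rng(seed)
    noisy = labels.copy()
    flip = rng.random(labels.shape[0]) < noise_rate
    noisy[flip] = rng.integers(0, n_classes, size=flip.sum())
    return noisy
```

Per-class calibration error, as used in the imbalance experiments, is the same computation restricted to the examples of a single class.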
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper presents an empirical study of whether modularity can emerge within neural networks it starts by proposing a novel definition of modularity that identifies modules by their functionality to discover the module that implements a specific target functionality the paper proposes to first pretrain the full network on the original task then freeze the pretrained weights and train a binary mask for each weight using gumbel-sigmoid the training objective for the masks is given by the target functionality eg a subtask of the original task plus some sparsity regularization the paper then investigates the discovered modules in terms of specialization reusability and compositionality the main findings are 1 neural nets tend to satisfy specialization but not reusability 2 weight sharing between modules tends to be affected more by whether io are shared than by task similarity and there tends to be less sharing in larger networks 3 when trained on algorithmic tasks neural nets fail to learn compositional rules and thus generalize poorly 4 cnns trained for image classification contain classspecific nonshared weights in the feature detectors pros the experiments are comprehensive covering many neural net architectures and datasets the results seem consistent across these architectures and datasets also source code is provided and the supplementary material provides sufficient details to reproduce the results the paper investigates natural emergence of functional modules which is a novel perspective on modularity cons some concepts are not rigorously defined for example in pspecialize and preuse when should two modules be considered the same the paper seems to define a module as a subset of weights does this imply that two different subsets potentially differing by only one element will correspond to two different modules if this is the case then preuse is extremely hard to achieve and it can never be true if the inputoutput neurons of two modules are different ie the separate io setting considered in section 32 hence the experiments in section 32 seem meaningless one may argue that the weights in the inputoutput layer should be excluded from the module but then the functionality of the module is not well defined another confusion i had is what can be a target functionality and what cannot take the addition/multiplication task as an example the task is basically to learn the function f(a, b, s) where a and b are two numbers and s is a switch indicating whether addition or multiplication should be performed for a and b the paper suggests f(a, b, s=addition) and f(a, b, s=multiplication) as two target functionalities which are the original function restricted to two subsets of the input space is it reasonable to consider fa99bs a target functionality would the result be different also all other experiments in the paper seem to construct target functionalities by similarly restricting the original functions are there other ways to define target functionalities i am somewhat skeptical about investigating properties like compositionality and generalization after network pruning it is well known that training performance does not reflect these properties so even if training performance only slightly drops after network pruning the pruned network may lose some properties that the full network has minor comments it is mentioned that separate io biases the network at initialization time to not share weights but to me it seems to only bias the
inputoutput layers rather than the hidden layers it would be interesting to see if shared io will also lead to unshared hidden weights fig 13 did not mention which lstm variant was used in section 33 why is increased sharing in the first layer undesirable it might not be possible to undo the permutation with only one layerdocsepthe paper investigates the functional modularity of neural networks using a simple yet efficient endtoend method the paper is well written and clearly articulates a contribution to the literature the proposed method is intuitive and straightforward the experimental evidence is provided for both synthetic language and image classification tasks most of the related works are cited concerns the biggest concern i had is whether the conclusion reached in the paper is invariant to different neural network architectures the size of the network and the complexity of the task as depicted in figure 6 a there is a huge difference in the relative drop in performance for simple cnn simple cnn without drouput and resnet110 it seems that a larger and more complex network tends to not sharing weights besides the paper uses accuracy drop after masking the weights as the main metric which is related to the number of masked weights it might be better to learn the binary mask for each subtask with a certain accuracy objective eg less than 12 lower than the original network for each subtask and compare the learned mask of different subtasks in addition it is useful to see the results on a more complex task such as imagenet in imagenet there exist many similar subtasks as many categories of the images are actually belong to one big category i am wondering whether these subtasks can share the weights minor comments the paper claims the advantage of using gumbelsigmoid than a simple threshold function however the stateoftheart binarized neural networks are trained using a simple sign function with a straightthrough estimator is there a significant difference eg the stability of the training in training the binary mask with the gumbelsigmoid and threshold function reasons for score i vote for accepting i like the finding that most neural networks have nonoverlapped functional modules however i still have some concerns about the generality of the conclusion and also the number of shared weights is not calculated with respect to a uniform accuracy requirement i would consider raising my score if the authors can address my concernsdocsep in its current form i feel that the paper should be disqualified because it contains some results essential to its claims in the appendix referenced in the second paragraph page 5 however this can be easily addressed thus my full review below summary of the paper the paper aims to analyze if and how neural networks learn modular representations modularity under the papers definition means that the network learns representations that 1 specialize using different modules for different functions and that 2 compose in a reusable fashion ie that functions are used in diverse tasks to do so the authors train different probabistic masks using a gumbelsigmoid on the weights of a neural network over a series of different tasks they incentivize these masks to be sparse by regularizing the number of active elements in a mask they then compare masks learned for different tasks including simple arithmetic tasks and permuted mnist and then compare the usage of the parameter masks over different tasks they find that nns learn to specialize only without reuse the paper 
continues by postulating that the reason is either that the network learned bad representations or that the network did not learn the correct composition they argue that results on the scan dataset show that the representations are of a sufficient quality concluding that the network did not learn the correct composition commentary the question the paper tries to answer is very relevant on two levels the first is that we do not understand how an nn learns sufficiently well the second is that compositional modularity is a highly desireable property and it is important to know if neural network exhibit it strengths the paper does some interesting analysis in general the paper is wellwritten and easily understood unfortunately the paper suffers from major drawbacks this is a minor point im putting here to facilitate my discussion below the paper is not positioning itself correctly in the literature thereby using confusing terminology while modularity is not a main focus in neural network research there exists some meaningful research that has established some terminology in essence the paper talks about compositional modularity and combinatorial generalization but does not use this terminology this makes the paper a little difficult to follow if one is familiar with this literature within this terminology specialize would be called modularize and preuse would be called compose a terminology i will use in the following the conclusions are not convincing the core argument in the paper is that neural networks fail to modularize because they either learn insufficient representations or because they fail to learn to compose to learn the algorithm required to utilize the modules correctly because the network learns reusable representations they modularize and must thus fail to compose while this is probably true i cannot help to feel a little underwhelmed it is well known that neural networks are overparameterized see the lottery ticket literature for this analysis this means that it appears to be easier for the model to relearn using the available capacity than to reuse the existing modules this is not particularly surpsising either because this is at its core overfitting the model does not generalize which in effect is a softer way to reuse but instead learns something akint to a separate function for different inputs what the paper does not investigate is why a neural network not reuses capabilities even those should be a good fit for a problem this would be a really interesting analysis one which i would very strongly argue for acceptance in any venue the novelty of some parts is overstated this is particularly true for using binary masks for multitask like learning see bengio e bacon p l pineau j precup d 2015 conditional computation in neural networks for faster models arxiv preprint arxiv151106297 additionally the paper does not sufficiently relate their insights to the cited work around inducing modularity and compositionaly in networks some of which does already come to similar results docsepthis paper studies weight modularity in neural networks nns in particular given a nn trained to perform a task a subset of weights are identified which in isolation perform well on a subtask of the original task such subsets are inspected to understand the extent to which they are specialized or reused across different subtasks of the original task to identify subtask specific weights a mask is learned that minimizes loss over a subtask when applied to the original nns frozen weights this process is 
carried out using gradient based optimization techniques adam extensive experiments are performed across various datasets and architectures the paper concludes that while nns seem to exhibit module specialization they fail to exhibit reuse strengths 1 understanding modularity within nns and its relation to failures of systematic generalization is an important research direction 2 the precise masking method proposed here is novel to the best of my knowledge 3 the functional view of modularity has clear advantages over clustering based approaches 4 i believe this work could be extended in interesting ways for example it seems like the methods presented here could naturally be extended to encourage modularity rather than measure it weaknesses 1 the paper addresses a well known issue that nns often fail to generalize systematically why this occurs and potential solutions are left unaddressed while the paper does put forward a hypothesis regarding why that the difficulty of learning routing operation in nns may lead current approaches to fail to discover solutions which generalize systematically little evidence is provided to support this conclusion as such i believe claims such as our approach can also be applied in these settings to provide additional insight regarding the underlying reason for the observed failure at systematic generalization should be relaxed or clarified if there is evidence for this hypothesis presented that i am failing to connect 2 how do we know the problem is with the full model rather than the mask recent literature cited in the paper demonstrates that performant models can be obtained even when severe weight constraints are imposed 1 2 indeed 2 demonstrates masking alone is sufficient to adapt a nn to a completely new task given these observations im concerned about the validity of assuming the mask optimization procedure identifies an independent functional module in the context of the computation performed by the full model it seems plausible that the mask performs well on the subtask given the fixed model weights but is unrelated to the function of the full model if this was the case i believe a number of the conclusions in the paper would not be justified the paper does provide some evidence this may not be occurring for example the doubleaddition experiment figure 2 shows that inverted masks perform well on the other pair this suggests the mask and inverse mask may correspond to distinct functionality in the full model however figure 10 tells a different story where inverse masks do not perform well on the other task is this due to the shared io in the additionmultiplication task is there any relation between the addition mask with shared weights removed and the inverted multiplication mask with shared weights removed ideally the paper would provide more evidence to support this assumption are units correlated in the masked and unmasked model could the analysis here be supported by explicitly constructing a model that exhibits reuse and verifying it can be identified 3 the paper argues in several places that a weight level analysis is necessary for reasoning about functional modularity eg without considering the contribution of individual weights it is not possible to reason about functional modularity i believe this statement conflates two separate issues 1 if modules should be defined in terms of units or weights and 2 whether modules should be defined by functionality as done in this work or using techniques such as clustering on the latter point i agree 
with the statements in the paper and believe the functional approach is more closely aligned with intuitive notions of what is meant by modularity however i dont believe a weight vs unit level perspective is necessarily mutually exclusive for example consider a simple affine layer h = f(wx) defining a module as a subset of elements of h would be equivalent to defining a module as a subset of rows in w

4 figures 2 10 14 would be much clearer if presented in a tabular format i had to write out the results this way myself to aid understanding

5 i found the use of a nonsymmetric sharing metric extremely confusing i spent a while pondering over figure 1 trying to figure out what the weights were shared with since both tasks are represented in the figure i believe this issue could be easily remedied for example by using something like iou jaccard index the paper uses this metric in appendix b1 to determine the amount of sharing between different masks trained on the same network and task

6 permuted mnist experiment 1 the paper states it suffices to retrain a new first layer to undo the permutation so that later layers can be reused this makes sense however given that the procedure is freeze the occupied weights lower level weights cannot perform this simple permutation by construction since they are frozen ive assumed that occupied means a nonzero mask value 2 why was the choice made to train masks and weights simultaneously in this setting this introduces an additional source of variation from previous experiments and im unsure what the benefit is

7 scan experiments 1 the paper states if the masking process removes any important weights then the solution is patternrecognition like instead of being based on reusable rules confirming explanation this is related to point 2 above and im not convinced this is the only possible conclusion while the mask has identified a subnetwork whose solution is patternrecognition like im not sure that the logical jump to conclude the model as a whole performs identically is sound 2 regarding the output weight analysis im also unsure why it should be the case that the final layer is sufficiently powerful for unbinding bound variables isnt it also possible that the mask learns to ignore those weights because they are not important for the task it is trained on

recommendation i recommend acceptance although i have several concerns with the paper i found the ideas and analysis presented very interesting i believe the community would benefit from further discussion scrutiny and exploration of the ideas presented

post author response period update most of my concerns have been addressed by the additional experiments and updated language in the latest revision i believe the techniques and analysis presented here for assessing reuse could be an important step between observations and explanations for the failure of nns to generalize systematically i have raised my score accordingly

minor issues 1 appendix c2 we always check the choosen hyperparameters we always check the chosen hyperparameters

references 1 arun mallya dillon davis and svetlana lazebnik piggyback adapting a single network to multiple tasks by learning to mask weights in proceedings of the european conference on computer vision eccv pp 6782 2018 2 adam gaier and david ha weight agnostic neural networks in advances in neural information processing systems pp 53645378 2019

### Summary:
this is a paper that is actively discussed the general sentiment is that this paper aims to address an important set of questions while the technique could be improved with more novelty the empirical study is extensive the concerns are about how to interpret the results or rather whether the empirical evidence fully supports the claimhypothesis after discussion and rebuttal the reviewers improved their scores and one reviewer remained at weaker marginally above threshold the ac read the paper and the discussion one value the ac sees is that the discussion threads between the authors and the reviewers provide a significant amount of scientific value the questions to be answered are hard and might indeed require further refinements in framing and conceptualization better techniques strong power in experimental designs to rule exclusively various hypotheses thus the ac recommends acceptance
input_ids: [long token-ID sequence omitted]
attention_mask: [matching sequence of 1s omitted]
labels: [token-ID sequence identical to input_ids omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review: two algorithms to deal with a map covering problem where there are unknown obstacles are proposed the first algorithm is a single agent algorithm in which it needs to deal with the explorationexploitation dilemma because of the partial observability of the environment the second algorithm extends the first algorithm by considering multiagent settings in which the exploration as well as the safe coverage is a challenge it is shown that the second algorithm obtains near optimal coverage while guaranteeing the safety same as below same as above

docsep

the paper proposes a multiagent coverage algorithm for a 2d environment with a priori unknown unsafe regions and noise density estimates that is guaranteed to be within a constant factor away from the optimal solution under certain assumptions eg well behaved density and safety functions the strength of the paper is a novel formulation of the exploration vs exploitation problem that takes into account agent safety in a multiagent setting as well as the performance guarantees obtained although the algorithms and analysis built on previous work eg 111623 i thought the work to be sufficiently novel the performance guarantee theorem 2 states that by doing enough iterations ie sampling as a function of epsilon and delta one can guarantee that the solution quality gets arbitrarily close ie epsilon to within a constant factor of 1 - 1/e of the optimum 10 with probability 1 - delta hence the claim in the abstract of near optimal coverage in finite time seems overstated in fact when running the algorithm for a finite time with probability delta > 0 one may end up worse than within epsilon from the constant factor approximation moreover even if running the algorithm indefinitely one can end up 1/e which corresponds to about 37% away from the optimum which is good but may not be considered as nearoptimal by everyone

the main weakness of the paper is that i path planning of individual agents is not considered samples are taken from regions of sufficiently safe cells hence ignoring the time and risks that accumulate for an agent as it travels through this region ii it is not clear to what extent the proposed methods scale as only k3 agents are considered and the environment is relatively small considering the intended applications 30 x 30 cells albeit it is not a toy problem given much prior work exists for k1 i would have appreciated an evaluation of the methods that specifically address the multiagent scenario in more complex settings

the paper claims that constraints prevent agents from monitoring from unsafe locations and agents may be unable to safely reach a disconnected safe area starting from their initial locations the problem statement assumes that agents can only monitor locations that they can in principle reach as they have to be part of the subset of the largest safely reachable region see equation 2 this prevents an agent from monitoring say the other side of a river if the river cannot be safely crossed irrespective of their sensing range although reasonable to not overly complicate the problem statement this assumption should be explicitly acknowledged as the optimality guarantees are not correct otherwise

the problem statement is challenging to follow given there are a number of interconnected aspects up to equation 2 it is rather straightforward however given uncertainties are introduced regarding the density and safety it is less clear thereafter what is actually the objective for example if the objective was to obtain the best possible result over a fixed duration which may be a realistic objective in practice then the methods would not necessarily deliver as the solutions presented do not seem to be anytime algorithms instead the objective seems to be to provide a solution however long it takes see also minor suggestions below

a few assumptions may make it harder to implement the method in practice including the need for a centralised optimisation and synchronous measurements as well as the functions needed to be continuous if they are not the advantage of the method may be reduced also looking at figure 3c it seems there is a certain window in terms of number of samples where safemac excels whereas if there are fewer samples the method is clearly outperformed by passivemac and as the number of samples approach 800 which is almost 900 one sample per cell i wonder whether other solutions exist that perform equally well or better i am not sure if algorithms 1 and 4 are fully correct in lines 3 and 5 respectively initially when t0 they probe w0 which is not initialised should this be w1 corollary 1 seems to suggest that choosing tp 0 always works i think it should be not in section 5 analysis you state since control coverage consists in maximizing a monotone submodular function we cannot compute the true optimum even for known densities why is this the case in general that is nongreedy approaches could one not simply enumerate through all solutions i could not find a definition of lq input of algorithm 4

minor comments when talking about approximations of the feasible sets are the max and min operators over x you first refer to algorithm 2 then to line 4 in algorithm 1 and then you write now we introduce macopt algorithm 1 which is not ideal where is 11 published in theorem 2 you start with let delta be element of (0, 1) but the same could be said for epsilon as you chose to use 2 variables instead of one a expander not applicable approaches inspired by it which demonstrates good originality

the submission is written in a very solid style the problem statement and definitions are formal and the pseudocodes for macopt and safemac are displayed in detail the paper analyzes macopt's convergence and safemac's optimality and safety properties using theorems and mathematical proofs the statements are accurately written and rigorously proved the full derivation is included in the appendix the structure of the main paper is clear and organized the authors demonstrate that multiagent coverage control tasks are a class of difficult problems especially when safety needs to be guaranteed and their new methods provably address the tasks more efficiently than previous works

on the other hand the description of the figures should be improved in the caption of figure 1 it says agent 1 covers d1 green 2 covers d2 orange and 3 covers d3 yellow however i cannot see where d1 d2 and d3 are labeled in figure 1a it also says in b in the optimistic set but this set is not marked and its color is not specified in figure 1b either another side note is about the grammar problem for instance in line 108 a the positive definite kernel matrix should be the positive definite kernel matrix and in line 265 such 1dlidars should be such as 1dlidars the authors specified that the limitation of the paper is that the proposed algorithms choose informative targets without planning informative trajectories which can be crucial in the research of robotics besides in some realworld applications the density and the constraints may not be the same as assumed in the paper in these cases the algorithms no longer have optimality and safety guarantees it is not likely that this paper will cause any potential negative social impact

docsep

the paper presents an algorithm for safe coverage and exploration with multiple robots particularly the paper focuses on an active information gathering problem where multiple mobile robots choose how to move around to explore the environment and maximize its coverage coverage is modelled through a spatial gaussian process density function exploration is necessary since the density is unknown a priori safety constraints are also considered the robots cannot access all locations in the environment which are also assumed unknown a priori the constraints are also modeled via gaussian processes the paper provides an algorithm that guarantees i nearoptimal coverage and ii satisfaction of the safety constraints the algorithm is evaluated on a synthetic and two real world applications safe biodiversity monitoring and obstacle avoidance

strengths proposed algorithm guarantees sublinear regret and safety algorithms effectiveness is illustrated in simulations

weaknesses density and constraints have a known structure being modeled as gaussian processes given kernel functions i would elaborate on how the chosen kernel functions are known a priori the robots are allowed to jump from any point in the safety set to any other the experiments consider robot teams of only 3 robots 1 the paper discusses in the conclusion that the robots currently are allowed to jump from any point in the safety set to any other the conclusion states that future work will address the problem of planning feasible trajectories 2 i would elaborate on the limitations of modeling the density and safety constraints as gaussian processes 3 i would elaborate on how scalable the proposed method is

### Summary:
this paper presents a novel method for multiagent coverage control over an unknown density and safety constraints there is some concern about the level of significance of the approach but it is interesting and sound there were also concerns about scalability and the use of gps for density modeling but the authors have sufficiently addressed these in the response and updated paper the paper would be strengthened by highlighting the contributions and more extensive experiments to show the benefits of the approach in different settings
input_ids: [long token-ID sequence omitted; list truncated at the end of the dump]
2852, 789, 588, 2953, 253, 1895, 273, 7219, 17887, 24102, 374, 50276, 74, 651, 21184, 327, 253, 7364, 273, 14053, 253, 4038, 285, 5252, 10806, 347, 305, 12064, 4870, 495, 50276, 74, 651, 21184, 327, 849, 44755, 253, 4081, 1332, 310, 50274, 187, 187, 4118, 18435, 27, 2520, 2929, 10262, 247, 4460, 1332, 323, 4471, 12788, 7031, 1453, 689, 271, 7202, 4038, 285, 5252, 10806, 627, 310, 690, 4468, 670, 253, 1268, 273, 8453, 273, 253, 2746, 533, 352, 310, 4722, 285, 3590, 627, 497, 671, 7350, 670, 9171, 1430, 285, 253, 897, 273, 305, 793, 323, 4038, 14053, 533, 253, 4477, 452, 10481, 9713, 841, 275, 253, 2380, 285, 9300, 2929, 253, 2929, 651, 320, 34615, 407, 27321, 253, 9021, 285, 625, 9470, 4679, 281, 921, 253, 5373, 273, 253, 2746, 275, 1027, 7533, 209 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 9389, 11333, 281, 2968, 342, 247, 3711, 10985, 1895, 835, 627, 403, 7202, 24238, 310, 4081, 253, 806, 5933, 310, 247, 2014, 5570, 5933, 275, 534, 352, 3198, 281, 2968, 342, 253, 17947, 15083, 80, 3535, 277, 4002, 785, 984, 273, 253, 7898, 1759, 1430, 273, 253, 3126, 50276, 783, 1273, 5933, 8725, 253, 806, 5933, 407, 7296, 4471, 12788, 7533, 275, 534, 253, 17947, 347, 973, 347, 253, 4999, 7031, 310, 247, 47853, 620, 4703, 352, 310, 2011, 326, 253, 1273, 5933, 4044, 2822, 8654, 3835, 356, 88, 1223, 23632, 253, 5252, 1072, 347, 2708, 50276, 18941, 347, 1840, 50276, 7152, 339, 431, 248, 2929, 29328, 247, 4471, 12788, 7031, 5933, 323, 247, 374, 69, 3126, 342, 247, 30400, 7202, 20372, 4811, 285, 6046, 4038, 8197, 326, 310, 16293, 281, 320, 1561, 247, 3638, 2803, 1977, 432, 253, 8654, 2900, 762, 2176, 13260, 24088, 973, 44944, 4038, 285, 5252, 3470, 253, 4757, 273, 253, 2929, 310, 247, 4460, 15895, 273, 253, 17947, 4632, 30211, 1895, 326, 3936, 715, 2395, 5570, 5252, 275, 247, 4471, 12788, 4758, 347, 973, 347, 253, 3045, 23632, 2797, 3738, 253, 11333, 285, 1783, 4270, 327, 2045, 789, 24088, 11334, 23, 1508, 891, 1869, 253, 789, 281, 320, 10481, 4460, 50276, 783, 3045, 12215, 10012, 374, 3054, 326, 407, 2509, 2217, 25142, 26332, 10491, 347, 247, 1159, 273, 299, 4277, 285, 18687, 581, 476, 12215, 326, 253, 2900, 3290, 4850, 29607, 2810, 26332, 299, 4277, 281, 1561, 247, 3638, 2803, 273, 1903, 70, 273, 253, 24571, 884, 342, 5912, 337, 50276, 3005, 7613, 253, 1750, 275, 253, 12002, 273, 2822, 8654, 7031, 275, 6486, 673, 3133, 689, 33834, 275, 958, 672, 3515, 253, 5933, 323, 247, 6486, 673, 342, 5912, 18687, 50276, 17, 581, 778, 990, 598, 7197, 685, 1561, 299, 4277, 432, 253, 3638, 2803, 11193, 25761, 1014, 604, 3515, 253, 5933, 39450, 581, 476, 990, 598, 337, 70, 534, 10140, 281, 670, 5345, 1977, 432, 253, 24571, 534, 310, 1175, 533, 778, 417, 320, 2783, 347, 2822, 29776, 407, 4130, 50276, 783, 2022, 14855, 273, 253, 2929, 310, 326, 891, 1854, 7219, 273, 2060, 6083, 310, 417, 2783, 3530, 403, 2668, 432, 4811, 273, 10481, 4999, 1341, 7613, 23111, 253, 673, 285, 10502, 326, 29010, 323, 271, 5570, 347, 352, 24376, 949, 436, 2919, 21255, 352, 310, 417, 2590, 281, 752, 6070, 253, 4081, 3082, 4311, 347, 760, 465, 20, 6083, 403, 2783, 285, 253, 3126, 310, 4942, 1355, 7296, 253, 6034, 4893, 1884, 1269, 1884, 1341, 23447, 352, 310, 417, 247, 20953, 1895, 1677, 1199, 2720, 789, 4961, 323, 465, 18, 891, 651, 452, 14109, 271, 7103, 273, 253, 3082, 326, 5742, 2953, 253, 4471, 12788, 10076, 275, 625, 2570, 7533, 50276, 783, 2929, 3916, 326, 10806, 3657, 6083, 432, 8667, 432, 20372, 8593, 285, 6083, 778, 320, 7591, 281, 15792, 3986, 247, 33817, 4999, 2170, 4983, 432, 616, 3302, 8593, 253, 1895, 3908, 19584, 326, 6083, 476, 760, 5724, 8593, 326, 597, 476, 275, 8063, 3986, 347, 597, 452, 281, 320, 629, 273, 253, 8578, 273, 253, 6253, 15792, 3986, 494, 2919, 923, 5150, 374, 436, 16897, 271, 5570, 432, 8667, 1333, 253, 643, 1930, 273, 247, 8281, 604, 253, 8281, 2550, 320, 15792, 13405, 30472, 273, 616, 17950, 2491, 3738, 5272, 281, 417, 27662, 5177, 366, 253, 1895, 3908, 436, 9376, 943, 320, 11120, 14969, 347, 253, 5556, 1319, 23632, 403, 417, 3451, 5010, 50276, 783, 1895, 3908, 310, 11132, 281, 956, 1677, 627, 403, 247, 1180, 273, 36282, 7794, 598, 281, 5150, 374, 352, 310, 2581, 15246, 2299, 1677, 20418, 403, 5611, 5001, 253, 4038, 285, 5252, 352, 310, 1679, 2590, 17096, 752, 
310, 2686, 253, 8103, 323, 1650, 604, 253, 8103, 369, 281, 4044, 253, 1682, 1896, 906, 689, 247, 4229, 7467, 50276, 4609, 778, 320, 15958, 8103, 275, 3946, 50276, 14644, 253, 3082, 651, 417, 7933, 7257, 347, 253, 5482, 3559, 513, 417, 1646, 281, 320, 28537, 11333, 3185, 253, 8103, 3133, 281, 320, 2085, 247, 5482, 2299, 1048, 352, 3936, 923, 671, 5884, 13991, 2708, 50276, 66, 1643, 13260, 778, 1056, 352, 12150, 281, 3359, 253, 1332, 275, 3946, 1690, 253, 878, 323, 247, 4275, 1701, 5556, 5837, 285, 34265, 6341, 347, 973, 347, 253, 3470, 3058, 281, 320, 5415, 604, 597, 403, 417, 253, 5750, 273, 253, 1332, 778, 320, 3777, 671, 2819, 387, 4677, 495, 68, 352, 3133, 627, 310, 247, 2176, 3497, 275, 2426, 273, 1180, 273, 3530, 835, 4389, 358, 317, 2507, 1241, 5727, 604, 627, 403, 11184, 3530, 253, 1332, 310, 4518, 41731, 10574, 407, 16864, 12432, 285, 347, 253, 1180, 273, 3530, 2746, 14212, 534, 310, 2761, 22908, 50276, 531, 3410, 591, 894, 891, 4282, 1880, 643, 5482, 4961, 326, 1347, 9696, 973, 390, 1805, 50276, 74, 717, 417, 2119, 604, 11333, 337, 285, 577, 403, 4751, 3451, 275, 3104, 495, 285, 608, 2975, 8523, 672, 246, 17, 597, 10304, 259, 17, 534, 310, 417, 3302, 1701, 943, 436, 320, 259, 18, 50276, 5528, 17405, 337, 3133, 281, 1804, 326, 13887, 246, 81, 50276, 17, 1900, 2987, 891, 1158, 352, 943, 320, 50276, 1439, 50275, 249, 2593, 608, 1783, 368, 1375, 1580, 1453, 7031, 8414, 275, 46875, 247, 49123, 749, 2307, 792, 1159, 359, 2550, 11897, 253, 2032, 24571, 1014, 323, 1929, 16689, 2139, 310, 436, 253, 1083, 275, 2087, 326, 310, 295, 543, 250, 6368, 7274, 812, 581, 417, 3365, 49860, 949, 512, 5482, 50276, 74, 812, 417, 1089, 247, 5426, 273, 298, 82, 3280, 273, 5933, 577, 50276, 37585, 5701, 50276, 9453, 5015, 670, 34754, 273, 253, 17887, 5239, 403, 253, 2781, 285, 1054, 9158, 689, 1269, 50276, 5658, 806, 3730, 281, 5933, 374, 840, 281, 1386, 577, 275, 5933, 337, 285, 840, 368, 3630, 1024, 359, 9569, 5315, 2178, 5933, 337, 534, 310, 417, 7445, 50276, 2811, 310, 1903, 3863, 50276, 249, 10012, 374, 368, 1265, 342, 1339, 18687, 320, 3284, 273, 14805, 533, 253, 1072, 812, 320, 753, 323, 299, 4277, 347, 368, 9703, 281, 897, 374, 4903, 3185, 273, 581, 50276, 66, 5645, 254, 417, 7763, 5474, 33032, 2520, 2929, 4453, 715, 4471, 12788, 7031, 1453, 5315, 3237, 275, 534, 2709, 6083, 13249, 285, 24171, 253, 3126, 281, 22950, 253, 7031, 273, 690, 16689, 352, 13698, 387, 4715, 253, 4038, 281, 8415, 253, 1895, 1223, 17749, 253, 5252, 273, 253, 6083, 352, 19186, 13067, 253, 5252, 48454, 4471, 12788, 7031, 1453, 1895, 253, 5315, 4836, 310, 23115, 347, 247, 1617, 595, 4872, 7031, 1159, 326, 25099, 41907, 23716, 285, 749, 2307, 792, 414, 247, 2014, 12788, 4999, 17947, 5933, 4907, 44165, 310, 5611, 970, 44165, 347, 253, 30328, 253, 4477, 12661, 5315, 2178, 271, 440, 48454, 4471, 12788, 7031, 1453, 5933, 285, 4389, 358, 317, 247, 5252, 48454, 4471, 12788, 7031, 1453, 5933, 326, 310, 3562, 407, 13633, 44165, 281, 4471, 12788, 2219, 285, 16248, 352, 342, 5315, 2178, 840, 597, 5276, 253, 14940, 273, 5315, 2178, 285, 253, 5556, 1319, 285, 5252, 3607, 273, 4389, 358, 317, 4720, 253, 2929, 25339, 849, 5315, 2178, 285, 4389, 358, 317, 403, 8936, 2429, 281, 5368, 3082, 407, 2509, 4679, 275, 13506, 285, 1524, 10186, 4893, 352, 2722, 326, 4389, 358, 317, 31326, 1805, 5482, 685, 11333, 326, 513, 417, 15257, 8338, 253, 17887, 2919, 285, 326, 352, 556, 2169, 3410, 6733, 685, 11771, 2822, 29776, 4999, 11333, 50276, 2520, 2929, 23970, 247, 2014, 12788, 4999, 17947, 5933, 44165, 347, 253, 4114, 8725, 352, 281, 247, 4471, 12788, 
2715, 285, 29328, 767, 11333, 323, 440, 48454, 285, 5252, 48454, 4471, 12788, 7031, 1453, 3237, 352, 7866, 432, 247, 973, 4304, 1332, 285, 840, 10262, 4460, 7274, 11797, 407, 352, 534, 14371, 1175, 3236, 414, 50276, 783, 19529, 310, 3542, 275, 247, 1077, 4891, 3740, 253, 1895, 3908, 285, 14308, 403, 7473, 285, 253, 10585, 406, 3180, 323, 5315, 2178, 285, 4389, 358, 317, 403, 8653, 275, 2508, 253, 2929, 3537, 13505, 5315, 2178, 256, 14940, 285, 4389, 358, 317, 256, 5556, 1319, 285, 5252, 3607, 970, 39383, 285, 15965, 27947, 253, 7234, 403, 13613, 3542, 285, 8132, 29689, 8058, 253, 2120, 28529, 310, 2908, 275, 253, 30762, 253, 2605, 273, 253, 2022, 2929, 310, 2590, 285, 10932, 253, 4477, 7568, 326, 4471, 12788, 7031, 1453, 8892, 403, 247, 966, 273, 2834, 3237, 3340, 672, 5252, 3198, 281, 320, 16293, 285, 616, 747, 3082, 872, 1598, 2953, 253, 8892, 625, 14556, 685, 2045, 2987, 50276, 251, 253, 643, 1133, 253, 5740, 273, 253, 8442, 943, 320, 5520, 275, 253, 11743, 273, 4677, 337, 352, 2296, 5570, 337, 10949, 277, 18, 4759, 374, 10949, 277, 19, 13735, 285, 495, 10949, 277, 20, 8862, 2299, 891, 2550, 923, 835, 277, 18, 277, 19, 285, 277, 20, 403, 13130, 275, 4677, 337, 66, 352, 671, 2296, 275, 270, 50276, 249, 253, 28684, 873, 533, 436, 873, 310, 417, 7101, 285, 697, 3295, 310, 417, 7616, 275, 4677, 337, 67, 2057, 50276, 23955, 1930, 3877, 310, 670, 253, 28146, 1895, 323, 4227, 50276, 249, 1386, 13278, 50276, 66, 253, 2762, 19040, 10295, 4315, 50276, 11425, 320, 50276, 783, 2762, 19040, 10295, 4315, 50276, 395, 275, 1386, 25905, 50276, 10328, 337, 11830, 301, 1032, 943, 320, 50276, 10328, 347, 337, 11830, 301, 1032, 50275, 783, 4477, 7616, 326, 253, 12291, 273, 253, 2929, 310, 326, 253, 4081, 11333, 5206, 27096, 8571, 1293, 7219, 27096, 24102, 534, 476, 320, 9560, 275, 253, 2561, 273, 15688, 982, 16280, 275, 690, 1524, 10186, 4893, 253, 4038, 285, 253, 10806, 778, 417, 320, 253, 1072, 347, 8025, 275, 253, 2929, 275, 841, 2219, 253, 11333, 642, 3356, 452, 5556, 1319, 285, 5252, 23632, 352, 310, 417, 2779, 326, 436, 2929, 588, 2847, 667, 2442, 4016, 2675, 3486, 5474, 339, 431, 248, 2929, 10262, 271, 5933, 323, 4999, 7031, 285, 17947, 342, 2709, 25497, 50276, 35456, 253, 2929, 16633, 327, 271, 3939, 1491, 16778, 1895, 835, 2709, 6109, 25497, 5206, 849, 281, 2118, 1475, 281, 8338, 253, 3126, 285, 22950, 697, 7031, 50276, 16484, 486, 310, 41329, 949, 247, 8820, 305, 12064, 1232, 4038, 1159, 50276, 15083, 7843, 310, 3309, 1580, 253, 4038, 310, 7202, 247, 30400, 50276, 84, 33816, 10806, 403, 671, 2783, 253, 25497, 2550, 2289, 512, 8593, 275, 253, 3126, 534, 403, 671, 8025, 7202, 247, 30400, 50276, 783, 10806, 403, 671, 23115, 3066, 305, 12064, 4870, 50276, 783, 2929, 3400, 271, 5933, 326, 23632, 891, 2822, 29776, 7031, 285, 21255, 13212, 273, 253, 5252, 10806, 253, 5933, 310, 6760, 327, 247, 13506, 285, 767, 1524, 1533, 4893, 4999, 40503, 8667, 285, 26982, 28772, 20544, 50275, 856, 7334, 5933, 23632, 749, 8172, 14938, 285, 5252, 50275, 267, 46042, 12510, 310, 12800, 275, 9938, 50276, 20881, 1255, 265, 50275, 20425, 285, 10806, 452, 247, 1929, 2605, 1146, 23115, 347, 305, 12064, 4870, 1677, 10295, 3470, 50276, 74, 651, 21184, 327, 849, 253, 6777, 10295, 3470, 403, 1929, 247, 30400, 50275, 783, 25497, 403, 4136, 281, 6923, 432, 667, 1127, 275, 253, 5252, 873, 281, 667, 643, 50275, 783, 4679, 1908, 15688, 6671, 273, 760, 495, 25497, 50276, 18, 50276, 783, 2929, 25339, 275, 253, 6452, 326, 253, 25497, 4390, 403, 4136, 281, 6923, 432, 667, 1127, 275, 253, 5252, 873, 281, 667, 643, 253, 6452, 3054, 326, 
2852, 789, 588, 2953, 253, 1895, 273, 7219, 17887, 24102, 374, 50276, 74, 651, 21184, 327, 253, 7364, 273, 14053, 253, 4038, 285, 5252, 10806, 347, 305, 12064, 4870, 495, 50276, 74, 651, 21184, 327, 849, 44755, 253, 4081, 1332, 310, 50274, 187, 187, 4118, 18435, 27, 2520, 2929, 10262, 247, 4460, 1332, 323, 4471, 12788, 7031, 1453, 689, 271, 7202, 4038, 285, 5252, 10806, 627, 310, 690, 4468, 670, 253, 1268, 273, 8453, 273, 253, 2746, 533, 352, 310, 4722, 285, 3590, 627, 497, 671, 7350, 670, 9171, 1430, 285, 253, 897, 273, 305, 793, 323, 4038, 14053, 533, 253, 4477, 452, 10481, 9713, 841, 275, 253, 2380, 285, 9300, 2929, 253, 2929, 651, 320, 34615, 407, 27321, 253, 9021, 285, 625, 9470, 4679, 281, 921, 253, 5373, 273, 253, 2746, 275, 1027, 7533, 209 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: placement is one critical step to ensure circuit performance including power consumption delay and chip area in physical design this paper presents a reinforcementlearningbased approach to optimize the half perimeter wirelength hpwl placing circuit modules sequentially this proposed method adopts multiview mask visual representation from the placement to ensure the nonoverlapping property among modules each mask clearly captures a certain aspect allowable positioning metric change global placement information at the current placement snapshot resulting in efficient policy learning the paper performed an extensive experimental study on multiple benchmarks and demonstrated the effectiveness of individual components through the ablation study strengths 1 this paper transforms the geometric placement problem into multiple visual representations using three masks which opens the possibility of using mature convolution networks to extract the global layout information in addition each mask has its distinct purpose benefiting the overall placement task from different angles the position mask guarantees the nonoverlapping property of the proposed method the wire mask gives a good estimation of the metric hpwl change for the action and the view mask provides a global view of the current placement results 1 the paper provides multiple components which can be beneficial for future research in this area the authors designed multiple masks for the placement and their corresponding efficient generation algorithms which can be used as individual components to inspire other related placement and routing tasks 1 this paper conducts an extensive experimental study of the proposed methods over various benchmarks and multiple metrics against both stateoftheart optimizationbased and rlbased approaches demonstrating the effectiveness of their method in addition the ablation study also shows the power of each individual component of the proposed method weaknesses 1 there exist certain inconsistencies between the training and inference phase due to congestion satisfaction the congestion satisfaction is not considered during the rl training phase however it is used during the inference phase during the inference phase the probability matrix of the action doesnt anticipate the effect of the congestion satisfaction which may result in lower quality placement compared to the training phase ignoring the congestion constraint 1 some detailed components of the design rl frameworks are not fully explained as listed in the questions below the proposed methods can be hardly applied to standard cell placement due to the large state and action space however the authors show in table 6 that the placement tool which can handle standard cell placement like dreamplace may also benefit from the proposed method of handling module placement this is beneficial for the eda tool development in general there is no identified negative societal impact docsepthis paper presents a rlbased chip placement approach maskplace to automatically generate a valid chip layout design compared with the former approaches that apply hypergraph to represent the chip layout maskplace adopts the pixel level graphical representation to represent the layout and pin offset which allows for maskplace to produce a better performance the results show that maskplace can achieve 6090 wirelength reduction with zero overlaps strengths of the paper 1 
chip placement with rl has been recently studied by multiple literature this work proposes a better solution for chip placement the comprehensive 2d pixelwise representation of the chip layout and wire length enables maskplace to involve the pin offset information when making the placement decision this results in a better performance than the previous work which adopts hypergraph to represent the chip layout 2 to generate the input state for the rl agents heuristic algorithms are proposed to produce the input masks for the rl agents with relatively low complexity 3 the presentation of the paper is clear the experimental results are comprehensive and promising weakness of the paper see the question section na docsepthe paper uses reinforcement learning to solve the chip placement problem it views the placement canvas as a 2d image during the representation learning strengths 1 the paper is well written and organized the appendix discusses related details 2 the method is sound and well defined 3 the results are charming and convincing weaknesses 1 dreamplace is used as a baseline which is not a good implementation for macro placement the current dreamplace is majorly optimized for standard cell placement it cannot handle macro placement very well the authors may consider replace 1 as the baseline 1 c cheng a b kahng i kang and l wang replace advancing solution quality and routability validation in global placement in ieee transactions on computeraided design of integrated circuits and systems vol 38 no 9 pp 17171730 sept 2019 doi 101109tcad20182859220 2 the authors use ispd 2005 benchmarks which are for standard cell placement macros are fixed in these benchmarks authors should describe how they edit the benchmarks the authors should also consider using mms benchmarks 2 2 j z yan n viswanathan and c chu handling complexities in modern largescale mixedsize placement 2009 46th acmieee design automation conference 2009 pp 436441 doi 10114516299111630028 3 several benchmarks are missing such as bigblue2 bigblue4 the authors should report the results of these benchmarks if they select the benchmark suites otherwise readers may suspect that these results are not ideal 4 the ablation study is not thorough for instance there are many masks in the proposed method what is the contribution of each mask is every single mask necessary what is the impact of the combination of these masks specifically if the method is really great it does not need the position mask that tells the agent the availability since the agent should have this information the authors discuss the limitation at the end of the paper the proposed method can only handle a small number of macros it can not tackle a large number of movable instances such as millions of standard cells i think the authors should stress this limitation throughout the paper otherwise readers may think that the paper proposes a method to place all instances in the first three section the authors use the term placement instead of macro placement docsepthis paper proposes maskplace which can automatically generate a highquality and valid layout within a few hours unlike previous methods that requires manual refinement to modify invalid placement maskplace casts placement as a problem of pixellevel visual representation learning more particularly 1 visual representation learning enables describing millions of circuit modules on a chip comprehensively 2 maskplace suggests a new policy network that can capture and aggregate both the global and subtle information 
on a chip canvas ensuring nonoverlapping placement efficiently 2 the demonstrated performance on 24 public chip benchmarks by outperforming graph placement and deeppr 5x and 9x in reducing wirelength with 0 overlap in layout strengths the paper suggested a more accurate representation of chip layout the three types of pixellevel feature maps can fully represent massive net and pin configurations different from deeppr that assumed all modules have unit size this work considers real sizes of modules the dense reward is an interesting idea it is an inexpensive way to create an advantage value at each reinforce step instead of generating one reward at only the end of the episode after the placement is done this can be both good and bad those as the decisions can be shortsighted such that the algorithm tries to incrementally improve wire length losing performance opportunity from sacrificing shortterm objectives however both the main results and ablation show that dense reward positively improves learning the paper has thorough comparisons with sota related work and demonstrated significant gains on a set of benchmark metrics like wire length density and overlap are improved the reviewer is happy to see ablation figure 6 of a decomposition of the gains all the design decisions add up to some of the performance weaknesses the results look very strong however are a bit counter intuitive the reward only considers wire length by continuously measuring an advantage function at each reinforce step the reviewer is not so sure how it could improve density which is demonstrated to be improved in all the results wire length and density are nonorthogonal metrics and are usually in conflict with each other the paper lacks more details on how the baseline methods are implemented and whether the comparisons are fair and all the promising numbers are trustable there is little formulations local mask fusion global mask encoder and decoder and other components in figure 3 should be formulated mathematically for others to reproduce the experiments yes ### Summary:
the reviewers are enthusiastic about the work and all recommended acceptance of the paper the reviewers find the work solid novel and potentially impactful for example reviewer mddg noted this paper transforms the geometric placement problem into multiple visual representations using three masks which opens the possibility of using mature convolution networks to extract the global layout information thanks to the authors for the detailed rebuttal and the thorough discussion with the reviewers incorporating the points raised during the discussion will further improve the paper
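The half-perimeter wirelength (HPWL) objective that the reviews above optimize is, per net, the half perimeter of the axis-aligned bounding box of that net's pins, summed over all nets. Below is a minimal illustrative sketch of this metric; the pin-coordinate data layout is an assumption made for the example and is not taken from the reviewed paper or its code.

```python
# Minimal HPWL sketch (illustrative only; the data layout is assumed, not from the paper).
# Each net is a list of (x, y) pin coordinates on the chip canvas.
from typing import List, Tuple

Net = List[Tuple[float, float]]

def hpwl(nets: List[Net]) -> float:
    """Sum over nets of the half perimeter (width + height) of the pins' bounding box."""
    total = 0.0
    for pins in nets:
        xs = [x for x, _ in pins]
        ys = [y for _, y in pins]
        # width + height of the axis-aligned bounding box enclosing all pins of this net
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

# toy usage: two nets with three and two pins respectively
example_nets = [[(0.0, 0.0), (4.0, 1.0), (2.0, 3.0)], [(1.0, 1.0), (1.0, 5.0)]]
print(hpwl(example_nets))  # (4 + 3) + (0 + 4) = 11.0
```

Pin offsets relative to module origins, which the reviews highlight as an advantage of the pixel-level representation, would simply shift each pin's coordinates before this bounding-box computation.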
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors present minif2f a dataset of formalized mathematical problems drawn from diverse sources including imo aime amc undergraduate and high school problems the focus is on algebra inequalities and number theory as those problems are easier to formalize than for example geometry or combinatorial problems the formalization is done in metamath lean with efforts for isabelle ongoing the authors run gptf on metamath and lean and the tidy baseline from the pact paper on the dataset and present results they find that proving in lean is vastly better for performance than metamath which they conjecture is due to access to higher level tactics in lean compared to metamath deep learning applied to theorem proving is i think one of its most exciting applications the multiple different frameworks and datasets are a barrier to making progress in this area as a community and to that end this dataset is a significant step the methods the authors apply on the dataset are fairly state of the art and serve as a good baseline for someone wanting to make further progress i do however think that some more analysis would be worthwhile in particular i think the authors should add the following 1 breakdown of the performance on the problems sourced from the math dataset by level of difficulty 2 a qualitative analysis of what kinds of problems the baseline models fail on and whether they fail on similar problems highly relevant new dataset with recent baselines run on it to get an idea of sota performance however detailed analysis of the baselines on the dataset is lacking docsepthis paper presents minif2f a test suite of olympiadlevel problems of theorem proving that is implemented in metamath lean and isabelle minif2f contains 488 individual theorem statements that are formalized from olympiad math contests gptf models trained on metamath and lean are evaluated on this test suite strengths 1 since previous benchmarks of atp mainly focus on basic math theorems minif2f fills the vacancy of the contestlevel test suite for verifying the performance of theorem provers i think this is an important step towards the goal of the grandimo challenge 2 the crosssystem design of minif2f provides a good way to compare different formalizations and proving systems 3 the experiment results demonstrate the importance of expert knowledge for theorem proving built with highlevel tactics gptflean achieves better results than gptfmm the formal theorem provers also work better than the natural languagebased problem solver questions 1 what are the meanings of custom and induction in table 1 2 what is the distribution of the number of theorems proved across different difficulty levels 3 personally i am quite curious about your experience of formalizing these problems what is the average time spent on one problem except for geometry and combinatorial problems how large portion of problems could be formalized and what would be the ultimate size of minif2f in your expectation overall i think minif2f is an important benchmark that could help the community advance the research of theorem proving i recommend accepting this paper i would like to maintain my old score of this paper after reading the authors responses and other reviewers comments docsepthis paper presents a new formal mathematics benchmark consisting of 488 statements expressed in three prominent theoremprovingverification systems baseline atp systems notably 
gptfpact in lean are evaluated on this benchmark strengths the advantages of this benchmark are that it is crosssystem and that it covers a variety of mathematical topics at the olympiad level the motivation for the particular assemblage of mathematical topics is solid minif2f is intended as an intermediate step toward the imo as an atp task which is out of reach for current systems this is the first effort to unify various olympiad topics in one dataset and the problems cover a wide scope of tactics and difficulties weaknesses questions the benchmark is not really as crosssystem as claimed in the abstract only 12 of the training statements are available in isabelle how were the olympiad and custom problems chosen the way in which multiplechoice problems are formalized gives additional information to the solver in table 4 the amc problem asks which value of an expression is possible quantifier on a and b but the formalization drops the quantifiers and asks to prove equality this would be incorrect under minor changes in the conditions it would seem more appropriate to formalize this with x 2 or x 12 or as a hypothesis problem 22 of amc 12b 2020 asked to find the maximum value of a certain function out of five choices yet the theorem amc12b2020p22 lean asks to prove that for all values of the argument the function is smaller than the correct maximum this is clearly insufficient without the prior knowledge of the correct answer we can imagine that the solver could prove a weaker bound but exceeded its timeout trying to prove correct bound aime problems are multiplechoice as well but it is perhaps forgivable not to formalize them as such suggestion to the authors code such as table 4 and the theorem on the circle and hyperbola p6 would be more readable with simple natural language annotations describing the meaning of each line for the benefit of readers who are not familiar with all three systems or do not see the solution the main value of this work is in the set of formal olympiad problem statements which have not appeared in other datasets there is little technical novelty in the ml algorithms and analysis of their performance especially in comparison to han et al 2021 on which this paper heavily relies docsepthe paper proposes a benchmark of formal olympiadlevel math problems focusing on algebra number theory and inequalities with crosssystem support on metamath lean and isabelle in development the paper evaluates performance of textitgptf on metamath and lean and a custom baseline textittidy on lean as well textbfstrengths the paper formalizes a decent amount of crosssystem olympiadlevel benchmark of 488 problems crosssystem support on metamath lean and isabelle provides benefit on comparing automation and tactics of systems olympiadlevel problems are also interesting to both researchers and general public the inclusion of formalization of a subset of math benchmark also enables comparing provers in formal and informal format the paper is wellwritten with good literature review on theorem proving benchmarks textbfweaknesses the paper could justify more on what type of problems were selected in textbfminif2f the benchmark mostly focuses on algebra number theory and inequalities will this benchmark in some way skew the research direction of the community to only focus on developing algorithms particularly suitable for solving these types of problems that may or may not generalize well to other types of problems such as geometry problems it would be interesting to see if the gap on pass 
rates between metamath and lean could be reduced when the models are trained or finetuned on a subset of textbfminif2f in addition to pretraining this could provide more evidence if the gap is mostly due to access to highlevel tactics the paper makes a solid step forward on creating crosssystem olympiadlevel formal math benchmark and should have profound benefit on the community with its continual development ### Summary:
the paper presents minif2f a dataset of 488 highschool and college level math problems the problems are fully formalized and include proofs in the metamath lean and isabelle theorem provers as the reviewers pointed out the support for isabelle is limited and that should be made clearer in the abstract this multiplatform support is the main selling point of the benchmark because it will make it possible to make direct comparisons among systems targeting different theorem provers the paper also does a good job discussing the benchmark selection and formalization process this is important since some of the problems were translated from word problems as part of the rebuttal the authors added extra information on the performance of the baselines and some qualitative details on how they fail overall there is agreement among the reviewers that this is a valuable benchmark that will enable comparisons among systems that today are very hard to compare
[ input_ids — token-ID encoding of the review/summary text above; several thousand raw tokenizer IDs omitted ]
[ attention_mask — a sequence of all 1s (no padding); omitted ]
[ labels — token-ID sequence that appears identical to input_ids for this record; omitted ]
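The three tensor columns above are only useful programmatically, so a short decoding sketch is given here instead of the raw integers. This is a minimal example and not part of the original dump: the dataset path is a placeholder, and the tokenizer is assumed to be a GPT-NeoX-style tokenizer (consistent with IDs such as 187 for a newline), which may not match the one actually used to build the data.

```python
# Hedged sketch: load one record of this review-summarization dump and decode
# its input_ids back to text. Dataset path and tokenizer are assumptions.
from datasets import load_dataset
from transformers import AutoTokenizer

ds = load_dataset("path/to/review-summary-dump", split="train")   # placeholder path
tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")    # assumed tokenizer

row = ds[0]
# input_ids, attention_mask and labels all have the same length per record.
assert len(row["input_ids"]) == len(row["attention_mask"]) == len(row["labels"])

text = tok.decode(row["input_ids"], skip_special_tokens=True)
print(text[:120])           # should start with "Below is given review of a research paper ..."
print(row["Output"][:120])  # the reference summary stored in the Output column
```

If the decoded text does not start with the prompt shown in the records above, the tokenizer assumption is wrong and should be replaced with the one used when the dump was created.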
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors present a novel architecture of an implicit unsupervised learning architectures using a teacher student approach in particular the main advantage to me seems to be the modecollapse property an important drawback in standard gan approaches the paper is written very well and is easy to follow the methodology is presented in a clear way and the experiments make sense given the research question i particular like that the authors define clear metrics to evaluate success which is often the weak point in unsupervised learning problems i believe the work is interesting but the results still preliminary and possibly limited by scalability as the authors put it the main bottleneck of lbt is how to efficiently solve the bilevel optimization problem on one hand each update of lbt could be slower than that of the existing methods because the computational cost of the unrolling technique grows linearly with respect to the unrolling steps on the other hand i appreciate the honesty in discussing possible scalability constraints i was a bit surprised that the method the authors propose seems to work well in the intramode kl divergence my expectation was that the main advantage of your method is capturing the global holistic shape of the distribution of the data whereas classical methods would because of mode collapse only capture specific subspaces therefore i would expect these classical methods to perform better in intramode kl divergence which is a metric to measure local not global approximation quality typos in practise introduction in practice 31 accent ascend conclusion on one hand other hand is used for two opposite ways of thinkingdocsepthis paper presents an learning by teaching lbt framework to train implicit generative models instead of using discriminator as in gans lbt adopts an explicit likelihood estimator as the student which is formulated as a bilevel optimization problem 1 maximize the loglikelihood of the generated samples 2 maximize the loglikelihood evaluated on the training data the authors argue that lbt avoids the mode collapse problem intrinsically as the missing modes will be penalized by the second objective i have some concerns on this why teaching an explicit likelihood can help learn an implicit one suppose the explicit likelihood estimator is a single gaussian but the real distribution has multiple modes fitting such the generated data and the training data on this likelihood will not help to avoid missing modes from the empirical results it is clear that lbtgan is better than lbt from the objective in 8 it seems the true reason is the pe and d together representing a mixture model which may fit the training data better in figure 2b the intramode kl divergence of lbtgan seems to be unstable during the training is this caused by the joint training of discriminator with the estimator can you discuss this in table 1 the authors just copied the results of veegan indeed in our implementation dcgan and veegan can be much better than the reported one the authors have not tried the effort to tune the results of baselines recently the least square gan has been purposed to address the mode collapse as well i suggested the authors should empirically compare with it as well generally the paper is wellwritten the idea is interesting however the motivation analysis and empirical results are not convincing enough to fully support their claim docsepthis work 
introduces a framework for learning implicit models that is robust to mode collapse it consists in learning an explicit model of the implicit model through maximum likelihood while the later is used to teach the explicit model to better match the data distribution the resulting bilevel optimization is carried out with truncated unrolled stochastic gradient descent quality the method combines an interesting set of ideas it is validated on some reasonable experiments however after reading the paper i remain with too many unanswered questions why should the method avoid mode collapse experiments clearly show that it indeed is resilient to mode collapse but i have would have been curious in seeing some more discussion regarding this point what is the exact mechanism that solves the issue what is the effect of k is mode collapse solved only because of the unrolled gradients what is the effect of m how does the method behave for m1 as usually done in gans what if the explicit model has not enough capacity the original unrolled gan paper presents better results for the ring problem why are results worse in the experiments more fundamentally what is the main benefit of this approach with respect to models that can be trained straight with maximum likelihood eg flowbased neural generative models and as required for the explicit model is it only to produce generative models that are fast because they are implicit why not training only the explicit model directly on the data clarity the paper is in general wellwritten although some elements could be removed to actually help with the presentation the development around influence functions could be removed as the method ends up instead making use of truncated unrolled gradients the theoretical analysis is straightforward and could be compressed in a single paragraph to motivate the method originality the method makes use of several ideas that have been floating around and proposed in different papers as far as i know the combination proposed in this work is original significance results show clear resistance to mode collapse which is an improvement for implicit models however other types of generative models generally do not suffer from this issue significance is therefore limited ### Summary:
the paper proposes a learning by teaching lbt framework to train an implicit generative model via an explicit one it is shown experimentally that the framework can help to avoid mode collapse the reviewers commonly raised the question why this is the case which was answered in the rebuttal by pointing to the differences between the kl and the jsdivergence and by showing a toy problem for which the jsdivergence has local minima while the kldivergence has not however it still remains unclear why this should be generally and for explicit models with insufficient capacity the case and if the model will be scalable to larger settings therefore the paper can not be accepted in the current form
[ input_ids — token-ID encoding of the review/summary text above; several thousand raw tokenizer IDs omitted ]
[ attention_mask — a sequence of all 1s (no padding); omitted ]
[ labels — token-ID sequence that appears identical to input_ids for this record; omitted ]
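Because labels appear to duplicate input_ids in these records, a model fine-tuned on them would by default compute its loss over the whole review-plus-summary sequence. If the loss should instead cover only the summary, the prompt/review span can be masked out. The sketch below is a hedged example of that preprocessing, not something specified by the dump itself; -100 is the ignore index used by Hugging Face / PyTorch cross-entropy.

```python
# Hedged sketch: mask everything up to and including "### Summary:" in labels,
# so cross-entropy is computed on summary tokens only (-100 = ignored index).
def mask_review_tokens(row, tokenizer, marker="### Summary:"):
    ids = list(row["input_ids"])
    labels = list(row["labels"])                 # assumed to start identical to input_ids
    marker_ids = tokenizer.encode(marker, add_special_tokens=False)

    # Search for the last occurrence of the marker; the marker may tokenize
    # differently when preceded by other text, so adjust if no match is found.
    start = None
    for i in range(len(ids) - len(marker_ids), -1, -1):
        if ids[i:i + len(marker_ids)] == marker_ids:
            start = i + len(marker_ids)
            break
    if start is not None:
        labels[:start] = [-100] * start          # ignore prompt + review tokens
    row["labels"] = labels
    return row

# Usage (with the ds/tok objects from the previous sketch):
# ds = ds.map(lambda r: mask_review_tokens(r, tok))
```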
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this work introduced an energyguided stochastic differential equation based method for image2image translation task unlike previous methods the proposed one takes the source training data into account and designs the energy function to preserve the domainindependent features and discard domainspecific features experiments on faces and animal faces show the effectiveness of the proposed method over existing gan based and diffusion based i2i works a nice interpretation of the discretization of egsde in the formulation of product of experts please find my concerns below i my biggest concern is that im not sure if energy or diffusion based models have beaten gan based approaches as claimed by authors the most typical gan based approach is stargan 8 which is cited by authors and the dataset source used in this work however i do not find any comparison with 8 in the experiment section any special reasons though 8 handles multidomain translation it is still okay to train the framework of 8 with just two domains additionally the fid numbers shown in 8 are much lower than what is shown in this paper overall im not quite convinced if gan based approaches have been outperformed as they also take both source and target training data into consideration ii is the proposed method based on energy function much slower than a feedforward step in gan approach if so i probably expect even higher performance in terms of quality user study is another good way to evaluate perceptual preference if there is no gt to evaluate iii it looks the egsde is not limited to i2i task would it be also applicable to other tasks like colorization or inpainting thanks for the rebuttal from authors after checking comments from other two reviewers i increased my score to 5 the reason why i asked authors to compare with the ganbased approach stargan v2 is that authors need to support the claim with solid experiments by saying the diffusion model has unique advantages on this task not just because diffusion model is popular nowadays and has not much explored for i2i yet especially considering that authors have used the dataset from stargan work but eventually ignore the comparison in the draft be more rigorous and hopefully do not make this type of mistake again in future research yes docsepthe author proposes an unpaired imagetoimage translation method using scorebased diffusion generative models compared with existing arts the method also considers the training data in the source domain to be specific the method deploys an energy function pretrained across two image domains experiments show the superiority of the proposed method strength 1 this paper considers the domainindependent and domaindependent features in diffusion models which is not explored until recently 2 measure content similarity with energy function in the diffusion model is a good idea the authors also provide a theoretical insight into egsde 3 the comparisons are sound and comprehensive weaknesses 1 one stateoftheart image translation is missing please compare egsde with diffusionclip 1 2 there are many other ways to address image translation and content preserving one can use classifier guidance to generate images in the target domain 2 to preserve identity information one can use identity features the author should highlight the novelty or the advancement compared with these solutions 3 the paper will benefit from a more exhaustive ablation study 
for example 1 the effectiveness of es and ei 2 replacing es with classifier guidance 2 or classifierfree guidance 3 4 the paper will benefit from a human evaluation ie comparing egsde with sdedit and ilvr on male female 1 diffusionclip textguided diffusion models for robust image manipulation 2 diffusion models beat gans on image synthesis 3 classifierfree diffusion guidance the limitations and potential negative societal impact have been adequately addressed docsepthe paper proposes scorebased diffusion generative models sdgms that does not ignore the training data in the source domain leading to improved solutions for unpaired i2i the proposed method is energyguided stochastic differential equations egsde that employs an energy function pretrained on both the source and target domains to guide the inference process of a pretrained sde for realistic and faithful unpaired i2i weakness evaluation on more standard datasets is missing strengths intuitive idea wellwritten paper no immediate negative impact docsepthis paper studies sdgms for image2image translation the authors propose energy functions pretrained from both source and target dataset to guide the sde inference process to generate realistic and faithful images experiments show that the proposed method outperforms sdgmbased methods quantitatively strengths 1 introducing the source domain knowledge to improve sdgmbased methods is promising and a good contribution 2 quantitative experimental results are rich the selection of baselines and evaluation metrics is reasonable 3 the paper is overall well written weaknesses 1 the background of sdgms is a bit hard to follow i suggest the authors give an intuitive understanding of what each equation does for image2image also i cant find what is fyt in practice 2 the paper lacks some qualitative analysis about the ablation of each expert a limitation is that unlike the realistic expert the faithful expert simply uses a lowpass filter without knowledge of source and target datasets as said in the paper this could be improved in future work ### Summary:
the paper proposes an unpaired imagetoimage translation method based on scorebased diffusion models compared to prior works 7 29 the paper adds two energy functions pretrained on both the source and target domains in a product of experts framework the paper has received positive reviews reviewers found the paper wellwritten the idea intuitive and the experimental results comprehensive the rebuttal further addressed the concerns regarding the user study running time and missing comparisons the ac agreed with the reviewers consensus and recommended accepting the paper
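The reviews in this example describe the energy-guided sampling procedure only in words; the snippet below is a minimal illustrative sketch of one guided reverse-diffusion step under a VP-type SDE, written to make the product-of-experts reading concrete. It is not the authors' implementation: the callables `score_model`, `e_realistic`, `e_faithful`, the weights `lam_s`/`lam_i`, and the schedule `beta` are hypothetical placeholders, and sign/step-size conventions vary with the SDE parameterization.

```python
import torch

def egsde_step(x, t, dt, beta, score_model, e_realistic, e_faithful, lam_s, lam_i):
    """One guided Euler-Maruyama step of a reverse-time VP-SDE (sketch only)."""
    x = x.detach().requires_grad_(True)
    # product-of-experts guidance: subtract gradients of the two energy experts
    # (realism w.r.t. the target domain, faithfulness to the source image)
    energy = lam_s * e_realistic(x, t) + lam_i * e_faithful(x, t)
    grad_e, = torch.autograd.grad(energy.sum(), x)
    guided_score = score_model(x, t) - grad_e
    # reverse-time drift for a VP-SDE with noise rate beta(t)
    drift = -0.5 * beta(t) * x - beta(t) * guided_score
    x_prev = x - drift * dt + (beta(t) * dt) ** 0.5 * torch.randn_like(x)
    return x_prev.detach()
```

Per the reviews above, the faithful expert is a simple low-pass-filter distance to the source image and the realistic expert is an energy pretrained on both domains; both enter the loop only through their gradients, so the sampling step itself is agnostic to those choices.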
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 789, 5611, 271, 2341, 26960, 19191, 8967, 5150, 1754, 1332, 323, 2460, 19, 5695, 10234, 4836, 12401, 2045, 3082, 253, 4081, 581, 3936, 253, 2603, 3733, 941, 715, 2395, 285, 11809, 253, 2341, 1159, 281, 14003, 253, 5028, 17777, 3386, 285, 37271, 10625, 29765, 3386, 4679, 327, 9365, 285, 5893, 9365, 921, 253, 12510, 273, 253, 4081, 1332, 689, 5368, 36827, 1754, 285, 12393, 1754, 891, 19, 74, 2987, 50275, 66, 5322, 7914, 273, 253, 35132, 1320, 273, 299, 5943, 615, 275, 253, 15895, 273, 1885, 273, 10071, 50276, 32897, 1089, 619, 7350, 2708, 50276, 74, 619, 5962, 4468, 310, 326, 516, 417, 2119, 604, 2341, 390, 12393, 1754, 3210, 452, 20698, 36827, 1754, 7274, 347, 7558, 407, 4477, 253, 954, 6867, 36827, 1754, 2746, 310, 4177, 1247, 854, 534, 310, 11106, 407, 4477, 285, 253, 10895, 2603, 908, 275, 436, 789, 2299, 891, 513, 417, 1089, 667, 5301, 342, 854, 275, 253, 3368, 2593, 667, 2714, 4606, 2167, 854, 22139, 23964, 297, 404, 10234, 352, 310, 1335, 8261, 281, 6194, 253, 7792, 273, 854, 342, 816, 767, 10625, 23000, 253, 269, 301, 3904, 2011, 275, 854, 403, 1199, 2406, 685, 752, 310, 2011, 275, 436, 2929, 4583, 516, 417, 3240, 13762, 604, 36827, 1754, 7274, 452, 644, 41731, 10574, 347, 597, 671, 1379, 1097, 2603, 285, 2303, 3733, 941, 715, 8180, 50276, 2886, 310, 253, 4081, 1332, 1754, 327, 2341, 1159, 1199, 17357, 685, 247, 3997, 10495, 3213, 275, 36827, 2746, 604, 594, 891, 3164, 1902, 1014, 2169, 3045, 275, 2426, 273, 3290, 2608, 1263, 310, 1529, 1175, 1039, 281, 7472, 39612, 14682, 604, 627, 310, 642, 305, 85, 281, 7472, 50276, 12211, 352, 4453, 253, 299, 5943, 615, 310, 417, 3710, 281, 891, 19, 74, 4836, 651, 352, 320, 671, 7763, 281, 643, 8892, 751, 3295, 1320, 390, 275, 31406, 1076, 50274, 35501, 323, 253, 30080, 22559, 432, 4477, 846, 12669, 5701, 432, 643, 767, 30628, 891, 2559, 619, 4868, 281, 608, 253, 1921, 2139, 891, 2546, 4477, 281, 7277, 342, 253, 36827, 3169, 2746, 4177, 1247, 362, 19, 310, 326, 4477, 878, 281, 1329, 253, 1750, 342, 4891, 4679, 407, 3981, 253, 12393, 1566, 556, 4451, 11361, 327, 436, 4836, 417, 816, 984, 12393, 1566, 310, 4633, 31735, 285, 556, 417, 1199, 14859, 323, 891, 19, 74, 2568, 3340, 7296, 326, 4477, 452, 908, 253, 10895, 432, 4177, 1247, 789, 533, 6524, 11823, 253, 5301, 275, 253, 7482, 320, 625, 26565, 285, 18670, 513, 417, 1056, 436, 1511, 273, 10551, 969, 275, 2852, 2561, 50276, 9820, 5474, 339, 431, 248, 2488, 29328, 271, 47223, 4440, 16713, 5695, 10234, 1332, 970, 4868, 3169, 12393, 1006, 800, 3210, 2429, 342, 5368, 14635, 253, 1332, 671, 19401, 253, 3733, 941, 275, 253, 2603, 5028, 281, 320, 2173, 253, 1332, 48593, 16376, 271, 2341, 1159, 3215, 11273, 2439, 767, 2460, 10625, 4679, 921, 253, 34385, 273, 253, 4081, 1332, 4757, 50276, 18, 436, 2929, 19401, 253, 5028, 17777, 285, 5028, 6820, 3386, 275, 12393, 3210, 534, 310, 417, 14859, 1919, 4102, 374, 2557, 2600, 14259, 342, 2341, 1159, 275, 253, 12393, 1566, 310, 247, 1175, 2934, 253, 4477, 671, 2085, 247, 10527, 12288, 715, 299, 5943, 615, 495, 253, 14023, 403, 3590, 285, 11088, 50276, 20881, 1255, 265, 337, 581, 1375, 23037, 14387, 2460, 10234, 310, 5816, 4496, 7277, 299, 5943, 615, 342, 12393, 11536, 337, 374, 627, 403, 1142, 643, 4088, 281, 2953, 2460, 10234, 285, 2600, 24279, 581, 476, 897, 30410, 12925, 281, 6635, 3888, 275, 253, 2303, 5028, 374, 281, 14003, 6489, 1491, 581, 476, 897, 6489, 3386, 253, 2488, 943, 6780, 253, 38135, 390, 253, 32992, 
2429, 342, 841, 5482, 495, 253, 2929, 588, 5649, 432, 247, 625, 41389, 28913, 1263, 323, 1650, 337, 253, 12510, 273, 1578, 285, 22616, 374, 15706, 1578, 342, 30410, 12925, 374, 390, 30410, 4924, 12925, 495, 577, 253, 2929, 588, 5649, 432, 247, 1966, 7103, 26332, 10941, 299, 5943, 615, 342, 256, 4861, 262, 285, 4164, 24987, 327, 5086, 50276, 35686, 50276, 18, 12393, 11536, 2505, 26960, 12393, 3210, 323, 10237, 2460, 19763, 50274, 19, 12393, 3210, 7171, 305, 507, 327, 2460, 9066, 50274, 20, 30410, 4924, 12393, 12925, 50275, 783, 7364, 285, 2442, 4016, 38058, 3486, 452, 644, 18212, 9713, 5474, 339, 431, 248, 2929, 29328, 4868, 3169, 12393, 1006, 800, 3210, 256, 27421, 983, 326, 1057, 417, 11823, 253, 3733, 941, 275, 253, 2603, 5028, 4283, 281, 5520, 5482, 323, 47223, 891, 19, 74, 253, 4081, 1332, 310, 50276, 14115, 26960, 19191, 8967, 7424, 299, 5943, 615, 326, 27532, 271, 2341, 1159, 3215, 11273, 327, 1097, 253, 2603, 285, 2303, 10625, 281, 7102, 253, 17032, 1232, 273, 247, 3215, 11273, 256, 615, 323, 15958, 285, 21738, 47223, 891, 19, 74, 14855, 50276, 15419, 2368, 327, 625, 2629, 15302, 310, 5816, 50276, 296, 3755, 20556, 50276, 565, 48714, 2934, 973, 15720, 2929, 642, 8993, 4016, 3486, 5474, 33032, 2520, 2929, 2175, 256, 27421, 983, 323, 2460, 19, 5695, 10234, 253, 4477, 12661, 2341, 3470, 3215, 11273, 432, 1097, 2603, 285, 2303, 10895, 281, 7102, 253, 256, 615, 17032, 1232, 281, 6635, 15958, 285, 21738, 3888, 4679, 921, 326, 253, 4081, 1332, 41731, 13015, 256, 27421, 1814, 833, 3082, 36878, 20544, 50276, 18, 16984, 253, 2603, 5028, 3640, 281, 3157, 256, 27421, 1814, 833, 3082, 310, 12532, 285, 247, 1175, 7680, 374, 11745, 5661, 1543, 403, 6793, 253, 5438, 273, 1666, 25379, 285, 7103, 17082, 310, 5272, 495, 253, 2929, 310, 4583, 973, 3542, 50276, 20881, 1255, 265, 337, 253, 4114, 273, 256, 27421, 983, 310, 247, 2372, 1892, 281, 956, 891, 1804, 253, 4477, 1918, 271, 27350, 4685, 273, 752, 1016, 5150, 1057, 323, 2460, 19, 5695, 50276, 12563, 891, 16216, 1089, 752, 310, 269, 1767, 275, 3946, 374, 253, 2929, 19756, 690, 18276, 1783, 670, 253, 28913, 273, 1016, 6485, 247, 12291, 310, 326, 12401, 253, 15958, 6485, 50276, 783, 21738, 6485, 3365, 4648, 247, 1698, 5858, 5806, 1293, 3640, 273, 2603, 285, 2303, 15302, 347, 753, 275, 253, 2929, 436, 812, 320, 5520, 275, 2852, 789, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 271, 47223, 4440, 16713, 5695, 10234, 1332, 1754, 327, 4868, 3169, 12393, 3210, 2429, 281, 2720, 2987, 818, 3285, 253, 2929, 11323, 767, 2341, 3470, 3215, 11273, 327, 1097, 253, 2603, 285, 2303, 10625, 275, 271, 1172, 936, 71, 7509, 7792, 253, 2929, 556, 2959, 2762, 10123, 30628, 1119, 253, 2929, 973, 15720, 253, 2934, 27350, 285, 253, 5661, 1543, 11088, 50276, 783, 30080, 22559, 2007, 9713, 253, 7350, 5001, 253, 2608, 1263, 3515, 673, 285, 5816, 14023, 253, 913, 5821, 342, 253, 30628, 13969, 285, 8521, 18738, 253, 2929, 50275 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 789, 5611, 271, 2341, 26960, 19191, 8967, 5150, 1754, 1332, 323, 2460, 19, 5695, 10234, 4836, 12401, 2045, 3082, 253, 4081, 581, 3936, 253, 2603, 3733, 941, 715, 2395, 285, 11809, 253, 2341, 1159, 281, 14003, 253, 5028, 17777, 3386, 285, 37271, 10625, 29765, 3386, 4679, 327, 9365, 285, 5893, 9365, 921, 253, 12510, 273, 253, 4081, 1332, 689, 5368, 36827, 1754, 285, 12393, 1754, 891, 19, 74, 2987, 50275, 66, 5322, 7914, 273, 253, 35132, 1320, 273, 299, 5943, 615, 275, 253, 15895, 273, 1885, 273, 10071, 50276, 32897, 1089, 619, 7350, 2708, 50276, 74, 619, 5962, 4468, 310, 326, 516, 417, 2119, 604, 2341, 390, 12393, 1754, 3210, 452, 20698, 36827, 1754, 7274, 347, 7558, 407, 4477, 253, 954, 6867, 36827, 1754, 2746, 310, 4177, 1247, 854, 534, 310, 11106, 407, 4477, 285, 253, 10895, 2603, 908, 275, 436, 789, 2299, 891, 513, 417, 1089, 667, 5301, 342, 854, 275, 253, 3368, 2593, 667, 2714, 4606, 2167, 854, 22139, 23964, 297, 404, 10234, 352, 310, 1335, 8261, 281, 6194, 253, 7792, 273, 854, 342, 816, 767, 10625, 23000, 253, 269, 301, 3904, 2011, 275, 854, 403, 1199, 2406, 685, 752, 310, 2011, 275, 436, 2929, 4583, 516, 417, 3240, 13762, 604, 36827, 1754, 7274, 452, 644, 41731, 10574, 347, 597, 671, 1379, 1097, 2603, 285, 2303, 3733, 941, 715, 8180, 50276, 2886, 310, 253, 4081, 1332, 1754, 327, 2341, 1159, 1199, 17357, 685, 247, 3997, 10495, 3213, 275, 36827, 2746, 604, 594, 891, 3164, 1902, 1014, 2169, 3045, 275, 2426, 273, 3290, 2608, 1263, 310, 1529, 1175, 1039, 281, 7472, 39612, 14682, 604, 627, 310, 642, 305, 85, 281, 7472, 50276, 12211, 352, 4453, 253, 299, 5943, 615, 310, 417, 3710, 281, 891, 19, 74, 4836, 651, 352, 320, 671, 7763, 281, 643, 8892, 751, 3295, 1320, 390, 275, 31406, 1076, 50274, 35501, 323, 253, 30080, 22559, 432, 4477, 846, 12669, 5701, 432, 643, 767, 30628, 891, 2559, 619, 4868, 281, 608, 253, 1921, 2139, 891, 2546, 4477, 281, 7277, 342, 253, 36827, 3169, 2746, 4177, 1247, 362, 19, 310, 326, 4477, 878, 281, 1329, 253, 1750, 342, 4891, 4679, 407, 3981, 253, 12393, 1566, 556, 4451, 11361, 327, 436, 4836, 417, 816, 984, 12393, 1566, 310, 4633, 31735, 285, 556, 417, 1199, 14859, 323, 891, 19, 74, 2568, 3340, 7296, 326, 4477, 452, 908, 253, 10895, 432, 4177, 1247, 789, 533, 6524, 11823, 253, 5301, 275, 253, 7482, 320, 625, 26565, 285, 18670, 513, 417, 1056, 436, 1511, 273, 10551, 969, 275, 2852, 2561, 50276, 9820, 5474, 339, 431, 248, 2488, 29328, 271, 47223, 4440, 16713, 5695, 10234, 1332, 970, 4868, 3169, 12393, 1006, 800, 3210, 2429, 342, 5368, 14635, 253, 1332, 671, 19401, 253, 3733, 941, 275, 253, 2603, 5028, 281, 320, 2173, 253, 1332, 48593, 16376, 271, 2341, 1159, 3215, 11273, 2439, 767, 2460, 10625, 4679, 921, 253, 34385, 273, 253, 4081, 1332, 4757, 50276, 18, 436, 2929, 19401, 253, 5028, 17777, 285, 5028, 6820, 3386, 275, 12393, 3210, 534, 310, 417, 14859, 1919, 4102, 374, 2557, 2600, 14259, 342, 2341, 1159, 275, 253, 12393, 1566, 310, 247, 1175, 2934, 253, 4477, 671, 2085, 247, 10527, 12288, 715, 299, 5943, 615, 495, 253, 14023, 403, 3590, 285, 11088, 50276, 20881, 1255, 265, 337, 581, 1375, 23037, 14387, 2460, 10234, 310, 5816, 4496, 7277, 299, 5943, 615, 342, 12393, 11536, 337, 374, 627, 403, 1142, 643, 4088, 281, 2953, 2460, 10234, 285, 2600, 24279, 581, 476, 897, 30410, 12925, 281, 6635, 3888, 275, 253, 2303, 5028, 374, 281, 14003, 6489, 1491, 581, 476, 897, 6489, 3386, 253, 2488, 943, 6780, 253, 38135, 390, 253, 32992, 
2429, 342, 841, 5482, 495, 253, 2929, 588, 5649, 432, 247, 625, 41389, 28913, 1263, 323, 1650, 337, 253, 12510, 273, 1578, 285, 22616, 374, 15706, 1578, 342, 30410, 12925, 374, 390, 30410, 4924, 12925, 495, 577, 253, 2929, 588, 5649, 432, 247, 1966, 7103, 26332, 10941, 299, 5943, 615, 342, 256, 4861, 262, 285, 4164, 24987, 327, 5086, 50276, 35686, 50276, 18, 12393, 11536, 2505, 26960, 12393, 3210, 323, 10237, 2460, 19763, 50274, 19, 12393, 3210, 7171, 305, 507, 327, 2460, 9066, 50274, 20, 30410, 4924, 12393, 12925, 50275, 783, 7364, 285, 2442, 4016, 38058, 3486, 452, 644, 18212, 9713, 5474, 339, 431, 248, 2929, 29328, 4868, 3169, 12393, 1006, 800, 3210, 256, 27421, 983, 326, 1057, 417, 11823, 253, 3733, 941, 275, 253, 2603, 5028, 4283, 281, 5520, 5482, 323, 47223, 891, 19, 74, 253, 4081, 1332, 310, 50276, 14115, 26960, 19191, 8967, 7424, 299, 5943, 615, 326, 27532, 271, 2341, 1159, 3215, 11273, 327, 1097, 253, 2603, 285, 2303, 10625, 281, 7102, 253, 17032, 1232, 273, 247, 3215, 11273, 256, 615, 323, 15958, 285, 21738, 47223, 891, 19, 74, 14855, 50276, 15419, 2368, 327, 625, 2629, 15302, 310, 5816, 50276, 296, 3755, 20556, 50276, 565, 48714, 2934, 973, 15720, 2929, 642, 8993, 4016, 3486, 5474, 33032, 2520, 2929, 2175, 256, 27421, 983, 323, 2460, 19, 5695, 10234, 253, 4477, 12661, 2341, 3470, 3215, 11273, 432, 1097, 2603, 285, 2303, 10895, 281, 7102, 253, 256, 615, 17032, 1232, 281, 6635, 15958, 285, 21738, 3888, 4679, 921, 326, 253, 4081, 1332, 41731, 13015, 256, 27421, 1814, 833, 3082, 36878, 20544, 50276, 18, 16984, 253, 2603, 5028, 3640, 281, 3157, 256, 27421, 1814, 833, 3082, 310, 12532, 285, 247, 1175, 7680, 374, 11745, 5661, 1543, 403, 6793, 253, 5438, 273, 1666, 25379, 285, 7103, 17082, 310, 5272, 495, 253, 2929, 310, 4583, 973, 3542, 50276, 20881, 1255, 265, 337, 253, 4114, 273, 256, 27421, 983, 310, 247, 2372, 1892, 281, 956, 891, 1804, 253, 4477, 1918, 271, 27350, 4685, 273, 752, 1016, 5150, 1057, 323, 2460, 19, 5695, 50276, 12563, 891, 16216, 1089, 752, 310, 269, 1767, 275, 3946, 374, 253, 2929, 19756, 690, 18276, 1783, 670, 253, 28913, 273, 1016, 6485, 247, 12291, 310, 326, 12401, 253, 15958, 6485, 50276, 783, 21738, 6485, 3365, 4648, 247, 1698, 5858, 5806, 1293, 3640, 273, 2603, 285, 2303, 15302, 347, 753, 275, 253, 2929, 436, 812, 320, 5520, 275, 2852, 789, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 271, 47223, 4440, 16713, 5695, 10234, 1332, 1754, 327, 4868, 3169, 12393, 3210, 2429, 281, 2720, 2987, 818, 3285, 253, 2929, 11323, 767, 2341, 3470, 3215, 11273, 327, 1097, 253, 2603, 285, 2303, 10625, 275, 271, 1172, 936, 71, 7509, 7792, 253, 2929, 556, 2959, 2762, 10123, 30628, 1119, 253, 2929, 973, 15720, 253, 2934, 27350, 285, 253, 5661, 1543, 11088, 50276, 783, 30080, 22559, 2007, 9713, 253, 7350, 5001, 253, 2608, 1263, 3515, 673, 285, 5816, 14023, 253, 913, 5821, 342, 253, 30628, 13969, 285, 8521, 18738, 253, 2929, 50275 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper improves upon stackelberg deep deterministic policy gradients by proposing a set of strategies on how to deal with the hessian part of the gradient this should overcome the major limitations of this class of methods which are the great time complexity and the slow converging rate with respect to the standard actorcritic framework the authors provide a formal justification on why removing parts of the hessian and using a blockdiagonal approximation is still achieving convergence time complexity is analyzed for many variations of the algorithm and the experimental section provides mixed results i found this paper in general well written and the method presented seems to be novel the authors analyzed many different hessian approximation scenarios and i believe that this line of work can be relevant in order to make the stackelberg approach feasible for larger neural networks unfortunately i found a few issues with the theory related to ddpg and the way experiments have been conducted which might violate most of the proposed theoretical results the main issue with this paper is that it considers the approximated offpolicy policy gradient of ddpg which appears here in equation 4 as if it was the true policy gradient i want to discuss two cases and explain why equation 4 is not correct in the onpolicy case the expectation of the left side of equation 4 should be over the distribution on the initial state of the agent while on the right side it becomes the improper discounted state visitation under the current policy this is because the gradient $\nabla_\theta Q^{\pi_\theta}(s,a)$ (here $a$ is any action, not the deterministic one) can be iteratively decomposed see theorem 1 in [1] and theorem 1 in [2] in the offpolicy case however since we do not have samples from the stationary distribution under the current policy the expectation on the left side of equation 4 is taken over the discounted state visitation under the behavioral policy here called $\rho(s)$ if this is the case the gradient $\nabla_\theta Q^{\pi_\theta}(s,a)$ is difficult to compute and it is usually dropped see eq 15 in [1] and comment below see section 2.2 and appendix b errata in [3] therefore the gradient is only an approximation of the true offpolicy policy gradient this is what happens in offpolicy dpg ddpg and td3 so in order to have equality in equation 4 the term $\nabla_\theta Q^{\pi_\theta}(s,a)$ must be added formally equation 4 should be $\nabla_\theta \, \mathbb{E}_{s \sim \rho(\cdot)}\big[ Q^{\pi_\theta}(s, \pi_\theta(s)) \big] = \mathbb{E}_{s \sim \rho(\cdot)}\big[ \nabla_\theta \pi_\theta(s)\, \nabla_a Q^{\pi_\theta}(s,a)\big|_{a=\pi_\theta(s)} + \nabla_\theta Q^{\pi_\theta}(s,a)\big|_{a=\pi_\theta(s)} \big]$ this issue is present in most of the proofs where equation 4 is used ignoring the additional term so the results of proposition 1 theorem 1 theorem 2 corollary 1 should not hold in the offpolicy setting note that the experiments performed are offpolicy and use a replay buffer containing past trajectories in the problem formulation it is not clear how $\rho(s)$ is defined the authors claim it is the discounted state distribution of $s$ but under which policy the definition of the actionvalue function should not depend on the initial state the authors define $Q_w^{\pi_\theta}(s,a)$ to be equal to $V_w^{\pi_\theta}(s)$ because they always consider the deterministic action of the policy this can be seen also in the minimization of the td error in equation 1 where the expectation is never taken over the actions in the replay buffer but only over the states $\nabla_w \, \mathbb{E}_{s \sim \rho(\cdot)}\big[ \big( Q_w^{\pi_\theta}(s, \pi_\theta(s)) - Q^{\pi_\theta}(s, \pi_\theta(s)) \big)^2 \big]$ note how this differs from ddpg where the target value function is deterministic but the learned value function is evaluated on a set of noisy actions this is relevant here because when taking the second order gradient the authors must use the chain rule for both terms in the temporal difference loss while this would not be the case in standard ddpg note that in the experiments the authors are instead using the standard ddpg approach and learn a full actionvalue function sampling stateaction pairs from the replay buffer this is evident also in table 4 where a noisy version of the policy is used for exploration how is this affecting the theoretical results in the proof of proposition 1 why is the term $\nabla_{\theta_t} Q^{\pi_\theta}(s,a)$ disappearing i would like to note that this paper is building on the work of zheng et al [4] which appeared on arxiv only 3 days before the first iclr submission deadline i was not able to verify if [4] has been peerreviewed but given the issues above i believe that [4] might contain similar theoretical problems [1] david silver guy lever nicolas heess thomas degris daan wierstra and martin riedmiller deterministic policy gradient algorithms in proceedings of the 31st international conference on machine learning volume 32 icml14 pages i387i395 jmlrorg 2014 [2] sutton r s mcallester d a singh s p mansour y 2000 policy gradient methods for reinforcement learning with function approximation in advances in neural information processing systems pp 10571063 [3] thomas degris martha white and richard s sutton offpolicy actorcritic in proceedings of the 29th international conference on machine learning icml12 pages 179186 usa 2012 omnipress [4] zheng l fiez t alumbaugh z chasnov b ratliff l j 2021 stackelberg actorcritic gametheoretic reinforcement learning algorithms arxiv preprint arxiv210912286 while i find the paper well written and potentially relevant for this field there is a major flaw in how the policy gradient is computed it is not clear to me how the theoretical results can be fixed in the offpolicy case while for onpolicy learning some additional work might be able to keep the results valid unfortunately the experiments conducted are in the offpolicy setting hence it is not clear what we can conclude from them given the issues above i propose to reject this paper in the current version docsepthis paper aims to establish a computationally efficient stackelberg training scheme for deeplearningbased actorcritic methods the proposed approach fast stackelberg deep deterministic policy gradient fsddpg considers the block diagonal approximation technique to reduce the training complexity while improving the convergence rate this paper conducts both theoretical analysis and empirical evaluation to demonstrate the strength of the proposed approach this paper is wellorganized and the discussion of the literature background is thorough the proposed method is novel and sound which has solid theoretical analysis and empirical grounding overall i recommend accepting this paper minor comments several curves in figure 3 do not seem to converge eg in inverteddoublependulum and walker it would be better to include a longer training horizon to establish a rigorous comparison in table 1 how does the scale of $\epsilon$ compare to $\frac{1}{nm}$ and $\frac{1}{o}$ in practice to my knowledge the method proposed by this paper is novel and tackles an important problem i recommend acceptance docsepthis paper proposes a method to accelerate stackelberg actorcritic
by ignoring the terms that contain the hessian in the indirect gradient term and applying a blockdiagonal approximation technique to the remaining inverse terms they prove a faster convergence rate of the proposed method to a fixed point experiments are conducted to validate the fast convergence and stability strengths this paper is wellwritten and organized the motivation and intuition of the proposed method are clear and easy to follow various experiments are presented to compare the proposed method with baselines weaknesses my concerns about this paper are threefold firstly the novelty the theoretical analysis in section 5 mainly follows theorem 5 in fiez et al 2020 the authors should clearly state the connections of these two works second the theoretical analysis is pretty weak they only show a faster convergence rate which is an expected and relatively weak result they dont show any theoretical results for stability which is stated as one of their two main contributions faster convergence rate and maintained stability third in terms of the experiments stability is mainly illustrated and validated only on toy examples hence less convincing in all although this paper is clear to follow and nicely presented i believe the theoretical analysis part is weak the fast convergence rate somehow lacks significant novelty the stability of the proposed method is less convincing docsepthis work proposes a variant of the stackelberg actorcritic algorithm which treats actorcritic as a stackelberg game with the actor being the leader and critic being the follower based on stackelberg ac this paper proposes to adopt deterministic policies in order to simplify the computation of the implicit gradient additionally the authors propose approximation schemes including dropping smallorder terms and matrix inversion via block diagonal matrix approximation the computational complexity of gradient computation is established moreover under relatively strong assumptions the proposed method is shown to be convergent experiments on mujoco are conducted to demonstrate the efficacy of the method strength this paper seems to provide numerical experiments on mujoco that are comparable to the stateoftheart weakness 1 the formulation of stackelberg game between the actor and critic seems problematic in specific the actor $\pi_\theta$ is the leader and the critic $Q_\omega$ is the follower the followers loss is the meansquared temporal difference error \begin{align} L(\theta, \omega) = \mathbb{E}_{s \sim \rho_\theta,\, a \sim \pi_\theta} \big[ \big( Q_\omega(s,a) - Q^{\pi_\theta}(s,a) \big)^2 \big] \end{align} where $\rho_\theta$ is the visitation measure or stationary distribution induced by the policy $\pi_\theta$ then when calculating the indirect gradient term in eq 3 we need to handle the partial derivative $\partial^2 L(\theta,\omega) / \partial\theta\,\partial\omega$ which involves taking the gradient with respect to $(s,a) \sim \rho_\theta \otimes \pi_\theta$ and needs to be handled using the policy gradient theorem this work seems to neglect this matter by using a fixed sampling distribution $\rho$ which is problematic 2 more importantly i feel that the issue raised in this work most ac methods perform stochastic gradient descent on the actor and critic simultaneously this can be regarded as performing gradient descent ascent gda zheng et al 2021 wen et al 2021 which is known to suffer from convergence to limit cycles or bad locally optimal solutions wang et al 2019 jin et al 2020 balduzzi et al 2018 moreover yang et al 2019 show that the critic is biased and may not converge when the actor and critic are updated simultaneously
with similar learning rates seems not well justified first it seems yang et al 2019 does not provide a proof showing biased critic leads to divergence second there are various recent works showing that actorcritic converges in finite time see [1][4] it seems unclear what advantage such a stackelberg view brings to the analysis 3 it seems that the stackelberg formulation of actorcritic used in this work is proposed in zheng et al 2021 the main contribution of this work is to (i) additionally use deterministic policy gradient and drop a term involving the td error and (ii) use block diagonal approximation in the calculation of the inverse hessian 4 the theoretical results and the proof seem hard to understand it seems theorem 1 directly follows from fiez et al 2020 and convex analysis moreover the assumption of positive definiteness seems very strong it would be nice to provide examples on which the assumptions hold in addition how is the stepsize separation $\tau$ chosen and how does it affect the theory references [1] a finite time analysis of two timescale actor critic methods yue wu weitong zhang pan xu quanquan gu [2] improving sample complexity bounds for natural actorcritic algorithms tengyu xu zhe wang yingbin liang [3] a twotimescale framework for bilevel optimization complexity analysis and application to actorcritic mingyi hong hoito wai zhaoran wang zhuoran yang [4] an improved analysis of variancereduced policy gradient and natural policy gradient methods yanli liu kaiqing zhang tamer basar wotao yin (a) the stackelberg view of actorcritic seems not well motivated (b) there exists a small issue in the implicit gradient involving the sampling distribution (c) the theoretical arguments depend on strong and ungrounded assumptions and the proofs are hard to follow ### Summary:
this paper aims to make stackelberg deep deterministic policy gradients practical and efficient the main contributions are an analysis which suggests terms involving the hessian can be dropped and a blockdiagonal approximation to an expensive matrix inversion several reviewers who voted for rejection expressed concerns about the soundness of the theoretical arguments the response provided by the authors did help alleviate some of the reviewers concerns but still left significant doubts while some of the remaining concerns could be due to a misunderstanding of the deterministic setting it is up to the authors to convince the reviewers that their arguments are sound given the current scores and the low confidence of the reviewer voting for acceptance i recommend rejecting the paper in its current form
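The reviews and summary above repeatedly refer to the "indirect gradient" of the Stackelberg formulation and to its block-diagonal approximation. The LaTeX block below restates the generic Stackelberg (implicit) gradient in the notation used by the reviews; it is a sketch of the standard construction from the game-theoretic literature the reviewers cite, not a transcription of the paper's exact equation 3.

```latex
% Stackelberg (leader) gradient for the actor, with the critic as follower.
% The second term is the indirect gradient; the method discussed above drops
% its Hessian-containing pieces and replaces the inverse
% (\nabla^2_{\omega\omega} L_c)^{-1} by a block-diagonal approximation.
\nabla^{S}_{\theta} L_a(\theta,\omega)
  = \nabla_{\theta} L_a(\theta,\omega)
  - \nabla^{2}_{\theta\omega} L_c(\theta,\omega)\,
    \bigl(\nabla^{2}_{\omega\omega} L_c(\theta,\omega)\bigr)^{-1}
    \nabla_{\omega} L_a(\theta,\omega)
```

Here $L_a$ and $L_c$ stand for the actor and critic objectives; the reviewers' soundness concerns are about how the expectations inside these losses are sampled, not about the algebraic form above.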
[ 6168, 19924, 522, 262, 6168, 284, 50276, 2520, 2523, 310, 1246, 275, 954, 273, 253, 27947, 835, 5150, 577, 310, 908, 23111, 253, 3081, 1307, 594, 253, 1543, 273, 13989, 337, 10012, 337, 10012, 374, 40460, 337, 943, 417, 2186, 275, 253, 745, 22872, 4758, 3877, 326, 253, 4679, 2684, 403, 745, 22872, 285, 897, 247, 44864, 6391, 4508, 2469, 24102, 50276, 249, 253, 1895, 15895, 352, 310, 417, 2590, 849, 13882, 375, 310, 2931, 253, 4477, 1750, 352, 310, 253, 42214, 1375, 3268, 273, 256, 533, 762, 534, 3646, 253, 5426, 273, 253, 2250, 2877, 1159, 943, 417, 3469, 327, 253, 3302, 1375, 50275, 783, 4477, 4853, 2805, 16471, 262, 6168, 19924, 281, 320, 4503, 281, 362, 16471, 262, 22666, 256, 984, 597, 1900, 1908, 253, 30027, 2250, 273, 253, 3646, 436, 476, 320, 2326, 671, 275, 253, 41458, 273, 253, 32989, 2228, 275, 5150, 337, 835, 253, 15355, 310, 1620, 2668, 689, 253, 5231, 275, 253, 44864, 6391, 533, 760, 689, 253, 3054, 295, 1752, 1403, 14168, 4482, 265, 948, 13882, 406, 5256, 2805, 18086, 22666, 8819, 8483, 6168, 284, 50276, 82, 18086, 6168, 284, 8483, 6168, 284, 374, 3877, 849, 436, 19986, 432, 32765, 8159, 835, 253, 2303, 1318, 1159, 310, 30027, 533, 253, 6311, 1318, 1159, 310, 6760, 327, 247, 873, 273, 27620, 5231, 436, 310, 4623, 1060, 984, 672, 3192, 253, 1273, 1340, 11786, 253, 4477, 1364, 897, 253, 5931, 4086, 323, 1097, 2426, 275, 253, 11935, 3064, 2957, 1223, 436, 651, 417, 320, 253, 1083, 275, 2629, 32765, 8159, 3877, 326, 275, 253, 4679, 253, 4477, 403, 3185, 970, 253, 2629, 32765, 8159, 2746, 285, 3037, 247, 2120, 2250, 2877, 1159, 10491, 1375, 3518, 8557, 432, 253, 44864, 6391, 436, 310, 8943, 671, 275, 2829, 577, 835, 247, 27620, 2715, 273, 253, 3646, 310, 908, 323, 17947, 849, 310, 436, 13567, 253, 10527, 1543, 50276, 249, 253, 4737, 273, 13989, 337, 2139, 310, 253, 1307, 295, 1752, 4349, 41506, 2805, 18086, 6168, 19924, 42689, 50276, 74, 651, 751, 281, 3877, 326, 436, 2929, 310, 3652, 327, 253, 789, 273, 1182, 24176, 1162, 355, 577, 534, 5420, 327, 549, 32693, 760, 495, 1897, 1078, 253, 806, 17857, 32888, 19529, 20639, 891, 369, 417, 2104, 281, 12654, 604, 577, 556, 644, 14218, 33349, 533, 1677, 253, 3374, 1840, 891, 2868, 326, 577, 1537, 3831, 2074, 10527, 3237, 50276, 18, 34926, 301, 9711, 5599, 19732, 6815, 16328, 344, 405, 289, 4921, 372, 737, 261, 4204, 266, 259, 1321, 10981, 285, 16172, 249, 209, 2200, 78, 6626, 30027, 3646, 11786, 11333, 275, 10061, 273, 253, 4562, 296, 5213, 8059, 327, 5213, 8059, 327, 5145, 4715, 50276, 21970, 4567, 17857, 1686, 1047, 7223, 891, 25745, 74, 20216, 75, 1686, 83, 2061, 4059, 50276, 19, 84, 28738, 391, 256, 278, 4065, 9358, 277, 247, 1625, 73, 256, 268, 50276, 24044, 454, 340, 5307, 3646, 11786, 3082, 323, 35221, 4715, 342, 1159, 11193, 275, 16424, 275, 11454, 1491, 5162, 2718, 7266, 884, 3011, 740, 3571, 50276, 20, 394, 4921, 372, 737, 261, 2304, 19243, 3168, 285, 6793, 472, 256, 256, 28738, 745, 22872, 12353, 68, 17425, 275, 10061, 273, 253, 3285, 394, 5213, 8059, 327, 5213, 8059, 327, 5145, 4715, 17857, 1686, 805, 7223, 24062, 20270, 441, 66, 4050, 297, 40298, 560, 50276, 21, 91, 24176, 298, 269, 466, 91, 246, 355, 3561, 3920, 1182, 448, 284, 30568, 270, 50276, 9296, 77, 1648, 298, 480, 43425, 8031, 293, 4978, 12353, 68, 17425, 18814, 10666, 30325, 35221, 4715, 11333, 549, 32693, 638, 3845, 549, 32693, 19, 12852, 805, 23360, 50276, 6050, 891, 1089, 253, 2929, 973, 3542, 285, 7826, 4623, 323, 436, 1673, 627, 310, 247, 2201, 19652, 275, 849, 253, 3646, 11786, 310, 10302, 352, 310, 417, 2590, 281, 479, 849, 253, 10527, 1543, 476, 320, 4229, 
275, 253, 745, 22872, 1083, 1223, 323, 327, 22872, 4715, 690, 3081, 789, 1537, 320, 2104, 281, 1978, 253, 1543, 3588, 19235, 253, 4679, 5196, 403, 275, 253, 745, 22872, 4758, 7613, 352, 310, 417, 2590, 752, 359, 476, 7525, 432, 731, 1677, 253, 3374, 1840, 891, 12661, 281, 12009, 436, 2929, 275, 253, 1655, 2715, 5474, 33032, 2520, 2929, 13698, 281, 5100, 247, 15180, 5919, 8031, 293, 4978, 3733, 6974, 323, 3676, 28269, 3169, 12353, 68, 17425, 3082, 253, 4081, 2746, 50276, 7957, 8031, 293, 4978, 3676, 30027, 3646, 11786, 25290, 1678, 8159, 19401, 253, 2972, 16421, 11193, 5853, 281, 4796, 253, 3733, 10454, 1223, 11138, 253, 14940, 2281, 436, 2929, 2589, 84, 1097, 10527, 1783, 285, 16774, 7103, 281, 7568, 253, 4757, 273, 253, 4081, 2746, 436, 2929, 310, 973, 34092, 285, 253, 5955, 273, 253, 6239, 4114, 310, 11080, 253, 4081, 1332, 310, 4460, 285, 3590, 534, 556, 4891, 10527, 1783, 285, 16774, 3216, 272, 4583, 891, 5583, 18738, 436, 2929, 50276, 37585, 5701, 50275, 43249, 9191, 275, 4677, 495, 513, 417, 1646, 281, 29623, 24088, 275, 28483, 12237, 16183, 15508, 285, 2940, 254, 352, 651, 320, 1805, 281, 2486, 247, 3356, 3733, 16892, 281, 5100, 247, 26565, 5301, 50275, 249, 2829, 337, 849, 1057, 253, 4311, 273, 299, 4277, 7277, 281, 1315, 317, 18, 10602, 285, 1315, 317, 18, 80, 275, 3946, 50276, 936, 619, 3640, 253, 1332, 4081, 407, 436, 2929, 310, 4460, 285, 39223, 271, 1774, 1895, 891, 5583, 14924, 5474, 33032, 2520, 2929, 29328, 247, 1332, 281, 28523, 8031, 293, 4978, 12353, 68, 17425, 407, 23111, 253, 2426, 326, 3831, 344, 859, 757, 275, 253, 11686, 11786, 1307, 285, 9433, 2972, 41758, 11193, 5853, 281, 5780, 13737, 2426, 597, 5276, 247, 7938, 14940, 2281, 273, 253, 4081, 1332, 281, 247, 4229, 1127, 4679, 403, 5196, 281, 17813, 253, 3809, 14940, 285, 7882, 50276, 296, 3755, 20556, 436, 2929, 310, 973, 15720, 285, 10932, 253, 16038, 285, 30328, 273, 253, 4081, 1332, 403, 2590, 285, 3477, 281, 956, 2710, 4679, 403, 3559, 281, 7277, 253, 4081, 1332, 342, 1666, 25379, 50276, 20881, 1255, 265, 619, 7350, 670, 436, 2929, 403, 1264, 71, 3502, 41005, 253, 38135, 253, 10527, 1783, 275, 2593, 608, 7194, 3637, 10012, 608, 275, 269, 466, 91, 1162, 355, 9169, 253, 4477, 943, 4518, 1375, 253, 10291, 273, 841, 767, 2987, 1273, 253, 10527, 1783, 310, 3965, 5075, 597, 760, 921, 247, 7938, 14940, 2281, 534, 310, 271, 3264, 285, 4942, 5075, 906, 597, 13414, 921, 667, 10527, 1543, 323, 7882, 534, 310, 4767, 347, 581, 273, 616, 767, 2022, 9021, 7938, 14940, 2281, 285, 8838, 7882, 2626, 275, 2426, 273, 253, 4679, 7882, 310, 7194, 12800, 285, 17618, 760, 327, 20953, 6667, 7613, 1679, 21414, 50275, 249, 512, 3738, 436, 2929, 310, 2590, 281, 956, 285, 23395, 3559, 891, 2868, 253, 10527, 1783, 629, 310, 5075, 253, 3809, 14940, 2281, 10380, 19756, 1534, 38135, 253, 7882, 273, 253, 4081, 1332, 310, 1679, 21414, 50276, 7152, 33032, 2520, 789, 29328, 247, 12955, 273, 253, 8031, 293, 4978, 12353, 68, 17425, 5933, 534, 26574, 12353, 68, 17425, 347, 247, 8031, 293, 4978, 2165, 342, 253, 12353, 1146, 253, 6657, 285, 7291, 1146, 253, 47201, 1754, 327, 8031, 293, 4978, 913, 436, 2929, 29328, 281, 5283, 30027, 7823, 275, 1340, 281, 25636, 253, 13782, 273, 253, 15424, 11786, 23000, 253, 4477, 12661, 11193, 15849, 1690, 18752, 1355, 2621, 2426, 285, 4315, 27697, 3066, 2972, 16421, 4315, 11193, 253, 15180, 10454, 273, 11786, 13782, 310, 4232, 25761, 762, 4942, 2266, 13260, 253, 4081, 1332, 310, 2011, 281, 320, 41886, 4679, 327, 278, 10441, 16856, 403, 5196, 281, 7568, 253, 10307, 273, 253, 1332, 4757, 436, 2929, 3133, 281, 2085, 
10704, 4679, 327, 278, 10441, 16856, 326, 403, 10870, 281, 253, 1375, 23037, 14387, 50275, 20881, 1255, 50276, 18, 253, 15895, 273, 8031, 293, 4978, 2165, 875, 253, 12353, 285, 7291, 3133, 20276, 275, 2173, 253, 12353, 8483, 22666, 310, 253, 6657, 285, 253, 7291, 2805, 3151, 310, 253, 47201, 253, 18409, 2957, 273, 253, 2097, 371, 1096, 1565, 412, 1544, 3064, 2228, 50276, 37803, 989, 525, 298, 3124, 40639, 50276, 1324, 1257, 50276, 84, 948, 13882, 4977, 893, 347, 303, 8483, 22666, 1943, 50274, 82, 3151, 618, 50276, 82, 18086, 22666, 50276, 6678, 19, 1943, 83, 50275, 423, 8623, 835, 13882, 4977, 893, 310, 253, 41820, 2557, 390, 17429, 3268, 5802, 407, 253, 3646, 8483, 22666, 840, 672, 18899, 253, 11686, 11786, 1307, 275, 16186, 495, 359, 50276, 22990, 281, 6016, 253, 7898, 1185, 87, 800, 7898, 374, 298, 3124, 40639, 50276, 3214, 39116, 7898, 40639, 534, 8687, 3192, 253, 11786, 342, 1675, 281, 618, 948, 13882, 4977, 893, 258, 3181, 8483, 22666, 285, 3198, 281, 320, 15726, 970, 3646, 11786, 289, 3703, 2013, 436, 789, 3133, 281, 18369, 436, 2647, 407, 970, 247, 4229, 10491, 3268, 50276, 2859, 534, 310, 20276, 50275, 19, 625, 15538, 891, 1928, 326, 253, 2523, 5439, 275, 436, 789, 50275, 2252, 913, 3082, 1347, 19191, 11786, 18499, 327, 253, 12353, 285, 7291, 10486, 436, 476, 320, 12258, 347, 9591, 11786, 18499, 49104, 305, 1473, 1182, 24176, 1162, 355, 43425, 259, 257, 1162, 355, 43425, 534, 310, 1929, 281, 11089, 432, 14940, 281, 2701, 11945, 390, 3076, 12171, 8654, 5482, 259, 606, 1162, 355, 6247, 480, 249, 1162, 355, 9169, 4273, 563, 29935, 1162, 355, 4765, 25761, 30966, 1162, 355, 6247, 921, 326, 253, 7291, 310, 23539, 285, 778, 417, 29623, 672, 253, 12353, 285, 7291, 403, 9300, 10486, 342, 2074, 4715, 4142, 3133, 417, 973, 17285, 806, 352, 3133, 30966, 1162, 355, 6247, 1057, 417, 2085, 247, 4737, 4645, 23539, 7291, 5644, 281, 23279, 1273, 627, 403, 2710, 3332, 2987, 4645, 326, 12353, 68, 17425, 26414, 275, 1442, 262, 7816, 923, 1249, 1706, 352, 3133, 12744, 752, 5750, 824, 247, 8031, 293, 4978, 1859, 10316, 281, 253, 1783, 50276, 20, 352, 3133, 326, 253, 8031, 293, 4978, 15895, 273, 12353, 68, 17425, 908, 275, 436, 789, 310, 4081, 275, 1182, 24176, 1162, 355, 43425, 253, 2022, 7680, 273, 436, 789, 310, 281, 891, 23000, 897, 30027, 3646, 11786, 285, 5926, 247, 1307, 7668, 32989, 2228, 285, 21255, 897, 2972, 16421, 11193, 275, 253, 10272, 273, 253, 13737, 344, 859, 757, 50275, 21, 253, 10527, 1543, 285, 253, 4737, 1646, 1892, 281, 2096, 352, 3133, 10012, 337, 3587, 3637, 432, 269, 466, 91, 1162, 355, 9169, 285, 17133, 1783, 25761, 253, 9376, 273, 10538, 1567, 832, 4478, 8098, 3133, 1077, 2266, 352, 651, 320, 5322, 281, 2085, 6667, 327, 534, 253, 13260, 2186, 275, 1635, 849, 310, 253, 5018, 907, 9712, 29201, 6777, 285, 849, 1057, 352, 2818, 253, 3762, 50276, 250, 3065, 50276, 18, 247, 6486, 673, 1783, 273, 767, 43936, 12353, 7291, 3082, 50276, 90, 489, 259, 86, 359, 262, 543, 1182, 12109, 3199, 1269, 86, 572, 266, 371, 266, 1149, 50276, 19, 11138, 3410, 10454, 14493, 323, 3626, 12353, 68, 17425, 11333, 50276, 85, 1205, 30838, 1269, 86, 1182, 248, 259, 606, 340, 272, 4805, 632, 606, 50276, 20, 247, 2500, 5786, 25912, 7792, 323, 26413, 652, 13757, 10454, 1783, 285, 2898, 281, 12353, 68, 17425, 50276, 3987, 28212, 288, 543, 8511, 7067, 259, 2284, 1182, 3227, 263, 266, 259, 606, 1182, 11917, 263, 266, 30966, 50276, 21, 271, 5520, 1783, 273, 11041, 43408, 3646, 11786, 285, 3626, 3646, 11786, 3082, 50276, 8202, 965, 632, 86, 465, 2284, 82, 272, 1182, 12109, 246, 13429, 1666, 274, 259, 5503, 80, 340, 
249, 247, 253, 8031, 293, 4978, 1859, 273, 12353, 68, 17425, 3133, 417, 973, 17194, 50275, 67, 627, 4961, 247, 1355, 2523, 275, 253, 15424, 11786, 7668, 253, 10491, 3268, 50275, 68, 253, 10527, 7125, 3469, 327, 2266, 285, 440, 2595, 264, 13260, 285, 253, 27947, 403, 1892, 281, 956, 50275, 187, 187, 4118, 18435, 27, 2520, 2929, 13698, 281, 1056, 8031, 293, 4978, 3676, 30027, 3646, 27935, 8542, 285, 5919, 253, 2022, 9021, 403, 271, 1783, 534, 5936, 2426, 7668, 253, 344, 859, 757, 476, 320, 8231, 285, 247, 2972, 41758, 11193, 281, 271, 8214, 4315, 27697, 50276, 43249, 30628, 665, 14285, 323, 18235, 4469, 7350, 670, 253, 3590, 1255, 273, 253, 10527, 7125, 253, 2380, 2530, 407, 253, 4477, 858, 1361, 33623, 690, 273, 253, 30628, 7350, 533, 1335, 1669, 1534, 24626, 1223, 690, 273, 253, 5780, 7350, 812, 320, 1955, 281, 247, 40663, 273, 253, 30027, 4758, 352, 310, 598, 281, 253, 4477, 281, 18578, 253, 30628, 326, 616, 7125, 403, 3590, 1677, 253, 1655, 7363, 285, 253, 1698, 7162, 273, 253, 37317, 13423, 323, 14924, 891, 5583, 33944, 253, 2929, 275, 697, 1655, 830 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[token-ID arrays (tokenized columns of a preceding example) omitted]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

This paper introduces and details a new research framework for reinforcement learning called Dopamine. The authors give a brief description of the framework, built upon TensorFlow, and reproduce some recent results on the ALE framework.

Pros:
1. Nice execution, and they managed to successfully reproduce recent deep RL results, which can be challenging at times.

Cons:
1. Given that this is a paper describing a new framework, I expected a lot more in terms of comparing it to existing frameworks like OpenAI Gym, rllab, RLlib, etc. along different dimensions. In short: why should I use this framework? Unfortunately, the current version of the paper does not provide me with the information to make this choice. Other than the framework, the paper does not present any new tasks/results/algorithms, so it is not clear what the contribution is.

Other comments:
1. The paragraphs in sections 2.1 and 2.2 (algorithmic research, architecture research, etc.) seem to say pretty much the same things; they could be combined, and the DQN could be used as a running example to make the points clear.
2. The authors mention tests to ensure reliability and reproducibility. Can you provide more details? Do these tests account for semantic bugs common while implementing RL algorithms?

docsep

Summary: the authors present an open-source, TensorFlow-based framework named Dopamine to facilitate the task of researchers in deep reinforcement learning (deep RL). It allows building deep RL using existing components such as reinforcement learning agents, as well as handling memory and logs and providing checkpoints for them. Emphasis is given to providing a unified interface to these agents, as well as to keeping the framework generic and simple (2000 lines of code). The framework was demonstrated on Atari games, notably using deep Q-network (DQN) agents. The authors provide numerous examples of parameter files that can be used with their framework. Performance results are reported for some agents (DQN, C51, Rainbow, IQN).

Given the actual trends in deep learning works, a unified framework such as that proposed is welcome. The automatization of checkpointing, for instance, is particularly useful for long-running experiments. Also, trying to reduce the volume of code is beneficial for long-term maintenance and usability.

Major concerns: this type of contribution may not match the scope of ICLR.

In the abstract and a large fraction of the text, the authors claim that their work is a generic reinforcement learning framework; however, the paper shows that the framework is very dependent on agents playing Atari games. Moreover, the word Atari comes out of nowhere on pages 2 and 5. The authors should mention in the beginning, e.g. in the abstract, that they are handling only agents operating on Atari games.

The positioning of the paper relative to existing approaches is unclear: state of the art is mentioned but neither discussed nor compared to the proposal.

The format of the paper should be revised. Section 5 (related works) should come before presenting the authors' work; when reading the preceding sections, we do not know what to expect from the proposed framework. All the code, especially in the appendices, seems not useful in such a paper and belongs rather in the online documentation of the authors' framework.

What is the motivation of the authors' experiments? To reproduce existing results (as claimed on page 1)? Then use the same settings as published works and show that the authors' framework reaches the same level of performance. To show new results, such as the effect of stickiness? Then the authors should explicitly say that one of the contributions of the paper is to show new results. The authors say that they want to compare results in figure 3, and they explain why the same scale is not used; in my opinion, the authors should find a way to bring all comparisons to the same scale.

For all these reasons, I think the paper is not ready for publication at ICLR.

docsep

Review: this paper proposed Dopamine, a new framework for deep RL. While this framework seems to be useful and the paper seems like a useful guide for using the framework, I didn't think that the paper had enough scientific novelty to be an ICLR paper. I think that papers on novel frameworks can be suitable, but they should demonstrate that they are able to do something, or provide a novel capability, which has not been demonstrated before.

Strengths: having a standardized tool for keeping replay buffers seems useful. The Dopamine framework is written in Python and has 12 files, which means that it should be reasonably easy for users to understand how it is functioning and to change things or debug. The paper has a little bit of analysis of how different settings affect results, such as how to terminate episodes, but I am not sure that it does much to help us in understanding the framework; I suppose it is useful to understand that the settings which are configurable in the framework affect results. The result on how sticky actions affect results is nice, but I am not sure what it adds over the Machado 2018 discussion.

Weaknesses: given that the paper is about documenting a new framework, it would have been nice to see more comprehensive baselines documented for different methods and settings. I do not understand the point of 2.1, in that it seems somewhat trivial that research has been done on different architectures and algorithms. In section 4.2, I wonder if the impact of training mode vs evaluation mode would be larger if the model used a stochastic regularizer; I suspect that in general, changing to evaluation mode could have a significant impact.

### Summary:
The paper presents Dopamine, an open-source implementation of plenty of DRL methods. It presents a case study of DQN and experiments on Atari. The paper is clear and easy to follow. While I believe Dopamine is a very welcome contribution to the DRL software landscape, it seems there is not enough scientific content in this paper to warrant publication at ICLR. Regarding specifically the ELF and RLlib papers: I think that the ELF paper had a novelty component and presented RL baselines for a new environment (MiniRTS), while the RLlib paper had a stronger systems research contribution. This says nothing about the future impact of Dopamine, ELF, and RLlib, the respective software.
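As an aside on the framework these reviews describe: Dopamine's workflow is driven by gin parameter files and a runner object that handles the checkpointing and logging mentioned above. A minimal launch script might look like the sketch below; the module path, function names, and config-file location are recalled from the public library's documented usage and are assumptions here, not details taken from the reviews.

```python
# Hypothetical sketch of launching a Dopamine DQN experiment.
# Module/function names and the gin path are assumed from the public
# library's documented usage and may differ across releases.
from dopamine.discrete_domains import run_experiment

base_dir = '/tmp/dopamine_runs/dqn'                    # checkpoints and logs land here
gin_files = ['dopamine/agents/dqn/configs/dqn.gin']    # example parameter file

run_experiment.load_gin_configs(gin_files, [])         # second arg: extra gin bindings
runner = run_experiment.create_runner(base_dir)        # builds the agent and environment
runner.run_experiment()                                # train/eval loop with checkpointing
```

The single runner entry point is what gives the unified interface that the second review credits the framework with.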
[input_ids, attention_mask, and labels columns for the example above (token-ID encodings of the same text) omitted]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

The paper proposes an interesting regularization method for encouraging function values to be discrete. The main idea is very interesting: with two correlated random vectors, the function is encouraged to maximize the correlation of the outputs from the two vectors. The function that maximizes the correlation is the indicator function of a halfspace. Combining this objective and the original training objectives, a learning model can output random values that are near 0 or 1; the model plays the same role as the Gumbel-softmax trick. The submission has done extensive experiments comparing the proposed technique against Gumbel-softmax. The results indicate that the new technique improves the performance in several learning tasks, though the improvement is not consistent.

Overall, I think the proposed new method is interesting and worth further study, but I also feel that the contribution of the submission is somewhat limited. In particular, I have several concerns about the submission.

1. I think the study is not thorough enough. Though we see some performance improvement, can we compare the proposed method against Gumbel-softmax at a more detailed level? For example, Gumbel-softmax uses temperature to control the extremeness of function values; I think the corresponding hyperparameter in the proposed method is the strength of regularization. Can we do a side-by-side comparison between the two hyperparameters, for example checking the performance vs. temperature and the performance vs. regularization strength? My hypothesis is that the temperature value is hard to tune while the regularization strength is easier to tune, and the performance difference might be from this reason. In general, I think optimization involving discrete variables is hard and backpropagation through these variables does not behave well; I don't feel the proposed method changes this property.

2. The performance improvement does not seem to be consistent; sometimes we see performance drops. Is the experiment in 5.3 a good comparison? The proposed method uses 20 clusters while the first two baselines use 30 clusters; more clusters should lead to higher performance, right? The authors may want to increase the number of clusters and check the performance. How do you get the performance of SB-VAE? I don't find it in the original paper.

3. Which experiments use mean-centering? Do you see a performance improvement by using mean-centering? I don't see a reason for mean centering in the experiments; I feel that the model will learn the function and decide an underlying mean even without mean-centering.

Overall, I think the technique is interesting, but I also feel the overall contribution of the submission is somewhat limited.

docsep

The paper proposes a method for regularizing neural networks with Boolean and categorical stochastic variables. The proposed stability regularization is based on the Gaussian noise stability, which is maximized by the indicator functions of half spaces. Experiments with various benchmarks show promising results compared to existing methods.

This work introduces stability regularization for training neural networks with discrete stochastic variables. Unlike most existing methods for handling discrete stochastic variables, the proposed method promotes the output of continuous functions of Gaussian random variables to be discrete, which leverages the nice property of the Gaussian noise stability. The authors have conducted various experiments on different benchmarks to showcase its superiority over existing methods.

It has been known that halfspaces maximize noise stability subject to a measure constraint since Borell's work in 1985, but the theorem only states the relationship between noise stability and partitions with two parts. This can be directly related to binary variables, but it is not obvious how it could be extended to categorical variables, that is, partitions with more than two parts. The way the authors handle this is to sum across dimensions, which is more like treating each dimension as independent binary variables rather than as a single categorical variable. This does not prevent the output from generating all zeros or multiple ones, which is no longer a categorical variable. It has been shown that the standard simplex partition fails to maximize multi-part noise stability unless all of the parts have equal measure [1, 2]; therefore, the analogue of Borell's theorem in the multi-part partition case seems still an open problem. The authors might want to discuss more regarding the way they extend the two-part result to the categorical case and how the proposed regularization (4) promotes a single one for the categorical variable.

The authors present two versions of stability regularization, one with mean centering and the other without centering. It seems natural to compare these two versions quantitatively and give recommendations for situations where one is better than the other. However, in all 6 experiments the authors only choose to provide results for either one of the two versions, without any direct comparisons between them, and the choice of which one to apply in each experiment appears a bit arbitrary. On the other hand, there are a few hyperparameters introduced for the regularization: the correlation variable, the maximum stability, and the regularization strength. The authors claim that these either do not impact the performance much or are easier to tune, without any quantitative results. The most ideal case is that one set of hyperparameters can be applied to multiple problems and gives reasonably good performance for all of them, but this does not seem to be the case here: the maximum stabilities are different across applications, and the regularization strength is only mentioned for the neural relational inference experiments. Since being hassle-free is one of the most important features of the proposed method, it is critical to provide a sensitivity analysis of the introduced parameters and demonstrate that competitive performance can be obtained without careful tuning.

The authors also discuss how the proposed regularization can be combined with the Gumbel-softmax method, where the output is from the Gumbel-softmax computation but the stability is computed with the vanilla softmax without adding the Gumbel noise. This creates a mismatch: the regularization is not applied on the real output. The authors might want to provide the rationale behind this and why the Gumbel noise is necessary to produce the output as input for the downstream network. One benefit of combining the stability regularization and the continuous relaxations, as the authors argue, is to provide a form of implicit temperature control and mitigate the need to manually tune the temperature. There are quantitative results supporting that the combination is better than using Gumbel-softmax alone, but it would be more interesting to see either less sensitivity to the temperature with the combination, or a universal improvement of the combination over Gumbel-softmax alone for different temperatures.

It is really nice that the authors conduct various experiments across different problem domains, but they are mostly in the same flavor without providing complementary support. I would suggest moving some of the experiments to the supplementary material; the saved space can be used for the sensitivity analysis mentioned above and for detailed comparisons with existing methods handling stochastic neural networks, similar to the experiments done in [3] (the same toy experiment, and Section B.2 in the supplementary material). After all, this is a work on training stochastic neural networks; it would be nicer to give a thorough comparison with the methods discussed in the related work.

Some typos: in Algorithm 2, v1 and v2 are used in line 2 but y1, y2 are used in line 3. The error of the mean-centered stability approximation is o(rho2 rho2 rho1) in Proposition 1 but o(rho1 rho1 rho2) in Section A of the supplementary material.

[1] S. Heilman, E. Mossel, and J. Neeman. Standard simplices and pluralities are not the most noise stable. ITCS 2015.
[2] A. De, E. Mossel, and J. Neeman. Noise stability is computable and approximately low-dimensional. Theory of Computing, 2019.
[3] A. Pervez, T. Cohen, and E. Gavves. Low bias, low variance gradient estimates for Boolean stochastic networks. ICML 2020.

The proposed stability regularization has a nice theoretical foundation and shows good performance across different problems involving discrete stochastic variables, but I lean towards rejection, mainly due to the following reasons: 1) the lack of theoretical foundation for the extension to the categorical variable; 2) the insufficient quantitative results supporting the hassle-free characteristic of the proposed method.

docsep

The paper describes a computational approach to discrete optimisation based on a probabilistic regularization procedure that enforces the output of a continuous function to be quasi-discrete. The idea is to use a property of Gaussian noise described by Borell's isoperimetric theorem.

Strengths: the method implemented in the paper is innovative and exploits a nice and not-so-popular property of the Gaussian distributions. Connecting real and discrete outputs through the expectation of the indicator function is a promising idea and may have a good impact on various discrete optimization applications. The paper has an extensive set of experiments.

Weaknesses: I have two main concerns about this work. One is the computational cost of the proposed method: probabilistic approaches to discrete optimization often suffer from high computational complexity (e.g., the likelihood is intractable); the proposed procedure is based on a different idea, but it looks like the resulting optimization requires Monte Carlo sampling or ELBO approximations. The second concern is about a possible comparison with other methods for discrete optimization, e.g., STE or more recent versions of it; if the authors have some available results, these could have been put in more evidence in the experiments section.

Questions: Would it be possible to use the method for learning a binary NN where only the weights, and not the layer representations, are discrete? Is this the first time Gaussian stability is used for learning approximations of discrete-valued functions? Could the proposed method be compared with other approaches for discrete optimization (e.g., STE) on both the quality of the output and the computational cost? How many artificial Gaussian variables are needed for a d-dimensional optimization? Is the procedure expected to be scalable for networks with millions of parameters?

A very nice idea, probably a bit expensive on the computational side.

docsep

The paper presents a new method to train neural networks with stochastic discrete variables, called stability regularisation. The method pushes the outputs of functions of Gaussian random variables to be close to discrete, and unlike other methods used with discrete variables, such as Gumbel softmax, which requires temperature annealing, it is easy to tune. The method is demonstrated on a very rich variety of tasks and models and shows state-of-the-art performance.

The paper builds on the concept of noise stability of functions of Gaussian variables. Noise stability measures the resilience of such functions to noise by estimating how their outputs correlate when they receive correlated Gaussian variables as input: the higher the correlation, the more stable the function. For a family of functions (bounded and of fixed volume), the stability is maximized by those functions that are indicator functions of half spaces, which is a natural way to construct binary variables. The paper proposes to maximize stability in order to obtain categorical variables. The implementation of the method is rather simple, and it comes with a small computational overhead due to the fact that it requires a double evaluation of the computational units whose output we want to push to be discrete.

The paper explores the performance of the method in a diverse set of experiments: learning discrete latent spaces, gating computational units using binary variables, and generating discrete outputs. I appreciate the breadth of scenarios. I only have a small comment on the fact that the comparisons are only done with the Gumbel-softmax: why this choice, and why not also compare with some of the other approaches that the authors discuss, e.g., straight-through and/or some variants of the score-based estimators?

Overall, this is a nice contribution to what is an important problem: learning discrete stochastic variables. The idea is simple and the authors demonstrate it on a variety of datasets. The only thing that I would have liked to see is a comparison with other methods that are meant to also work in such discrete settings, e.g., straight-through and score-based estimators.

### Summary:
The paper introduces a method to train neural networks based on so-called stability regularisation. The method encourages the outputs of functions of Gaussian random variables to be close to discrete and does not require temperature annealing like the Gumbel softmax. All reviewers agreed that the proposed method was novel and of interest. The authors conducted extensive experiments; they also adequately addressed the concerns raised by the reviewers (e.g., theoretical foundation and computational cost).
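The mechanism all of these reviews describe — rewarding correlated, near-binary outputs when the same unit is fed two rho-correlated Gaussian noise vectors — is easy to prototype. The sketch below is only my reading of that verbal description, not the paper's actual implementation: the function names, the exact form of the penalty, and how it is weighted against the task loss are assumptions.

```python
# Illustrative sketch of a noise-stability style penalty, assuming the
# description in the reviews above: evaluate the same stochastic unit on two
# rho-correlated Gaussian inputs and reward correlated (near-discrete) outputs.
import torch

def stability_penalty(unit, z1, rho=0.9):
    """unit maps a noise vector to outputs in [0, 1] (e.g. a small net + sigmoid)."""
    eps = torch.randn_like(z1)
    z2 = rho * z1 + (1.0 - rho**2) ** 0.5 * eps   # rho-correlated copy of z1
    y1, y2 = unit(z1), unit(z2)
    # E[y1 * y2] is maximized by halfspace indicator functions, so subtracting it
    # from the loss pushes the unit's outputs towards {0, 1}.
    return -(y1 * y2).mean()

# Hypothetical usage in a training step, with lam playing the role of the
# regularization strength discussed in the first review:
# loss = task_loss + lam * stability_penalty(gate_net, torch.randn(batch_size, d))
```

Under this reading, the regularization strength (lam) takes over the role that the temperature plays in Gumbel-softmax training, which is exactly the trade-off the reviewers ask to see analyzed.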
[input_ids, attention_mask, and labels columns for the example above (token-ID encodings of the same text, truncated) omitted]
627, 403, 247, 1643, 4373, 22041, 5611, 323, 253, 37820, 253, 5921, 4778, 253, 4869, 7882, 285, 253, 37820, 4757, 253, 4477, 1750, 326, 841, 2057, 513, 417, 3486, 253, 3045, 1199, 390, 403, 6927, 281, 19928, 1293, 667, 11745, 1543, 253, 954, 7445, 1083, 310, 326, 581, 873, 273, 4373, 22041, 476, 320, 3732, 281, 2709, 3237, 285, 4245, 12054, 1175, 3045, 323, 512, 273, 731, 533, 436, 1057, 417, 1646, 281, 320, 253, 1083, 1060, 253, 4869, 331, 6720, 403, 1027, 2439, 4893, 285, 253, 37820, 4757, 310, 760, 5393, 323, 253, 11454, 38524, 17032, 4679, 1580, 253, 45454, 4924, 310, 581, 273, 253, 954, 1774, 3386, 273, 253, 4081, 1332, 352, 310, 4619, 281, 2085, 7340, 1783, 273, 253, 5611, 3602, 285, 7568, 326, 12085, 3045, 476, 320, 2797, 1293, 10182, 25184, 50276, 783, 4477, 671, 2319, 849, 253, 4081, 37820, 476, 320, 5678, 342, 253, 305, 3561, 293, 5530, 4090, 1332, 835, 253, 3453, 310, 432, 253, 305, 3561, 293, 5530, 4090, 13782, 533, 253, 7882, 310, 10302, 342, 253, 26724, 2602, 4090, 1293, 6240, 253, 305, 3561, 293, 6046, 436, 10513, 247, 29713, 1060, 253, 37820, 310, 417, 3732, 327, 253, 1524, 3453, 253, 4477, 1537, 971, 281, 2085, 253, 24775, 3212, 436, 285, 2139, 253, 305, 3561, 293, 6046, 310, 3309, 281, 4711, 253, 3453, 347, 3280, 323, 253, 15450, 2990, 581, 5649, 273, 16248, 253, 7882, 37820, 285, 253, 5415, 7921, 569, 347, 253, 4477, 9059, 310, 281, 2085, 247, 830, 273, 15424, 3276, 1453, 285, 29966, 253, 878, 281, 13542, 19928, 253, 3276, 627, 403, 11745, 1543, 8109, 253, 5019, 310, 1805, 685, 970, 305, 3561, 293, 5530, 4090, 3815, 533, 352, 651, 320, 625, 4722, 281, 923, 2057, 253, 1679, 7996, 281, 253, 3276, 342, 253, 5019, 390, 253, 10898, 7756, 273, 253, 5019, 689, 253, 305, 3561, 293, 5530, 4090, 3815, 323, 1027, 9208, 50276, 262, 310, 1663, 5322, 326, 253, 4477, 2589, 2710, 4679, 2439, 1027, 1895, 10625, 533, 597, 403, 6571, 275, 253, 1072, 13746, 1293, 5277, 19767, 1329, 891, 651, 1804, 2118, 690, 273, 253, 4679, 281, 253, 24864, 2144, 253, 9809, 2317, 476, 320, 908, 323, 253, 7340, 1783, 5393, 1840, 285, 253, 7000, 14023, 342, 5368, 3082, 10885, 19191, 11454, 6928, 2074, 281, 253, 4679, 2218, 275, 495, 253, 1072, 20953, 3368, 285, 253, 2593, 270, 19, 275, 253, 24864, 2144, 846, 512, 436, 310, 247, 789, 273, 3733, 19191, 11454, 2990, 352, 651, 320, 49482, 281, 1918, 247, 11080, 5301, 342, 3082, 5469, 275, 253, 2905, 789, 50276, 8826, 963, 993, 50276, 249, 5933, 374, 362, 18, 285, 362, 19, 403, 908, 275, 1386, 374, 533, 340, 18, 340, 19, 403, 908, 275, 1386, 495, 50276, 783, 2228, 273, 253, 1599, 34872, 7882, 11193, 275, 13989, 337, 310, 390, 1689, 19, 2859, 19, 50276, 2859, 18, 533, 390, 1689, 18, 391, 1689, 18, 50276, 2859, 19, 275, 2593, 247, 275, 253, 24864, 2144, 50276, 18, 256, 344, 300, 1342, 299, 43177, 293, 285, 480, 425, 11155, 2629, 8077, 1271, 285, 25540, 1005, 403, 417, 253, 954, 6046, 6474, 352, 6113, 4104, 50276, 19, 247, 372, 299, 43177, 293, 285, 480, 425, 11155, 6046, 7882, 310, 2475, 255, 934, 285, 5512, 1698, 6967, 3762, 273, 12672, 6247, 50276, 20, 247, 591, 29350, 246, 820, 864, 285, 299, 305, 580, 1634, 1698, 8492, 1698, 11041, 11786, 8197, 323, 12419, 19191, 6928, 17857, 1686, 9169, 253, 4081, 7882, 37820, 556, 5322, 10527, 12153, 285, 2722, 1175, 3045, 2439, 1027, 3237, 7668, 13358, 19191, 4903, 533, 891, 717, 9644, 281, 12009, 7194, 1955, 281, 253, 1563, 4606, 337, 253, 3480, 273, 10527, 12153, 323, 253, 6880, 281, 253, 31091, 4778, 374, 253, 12497, 11745, 1543, 8109, 253, 45454, 4924, 8847, 273, 253, 4081, 1332, 5474, 339, 431, 248, 2929, 8631, 247, 
15180, 2746, 281, 13358, 5556, 5837, 1754, 327, 247, 37851, 37820, 5199, 326, 546, 36217, 253, 3453, 273, 247, 5415, 1159, 281, 320, 21582, 30861, 6713, 253, 2934, 310, 281, 897, 247, 2867, 273, 305, 12064, 6046, 2529, 407, 17564, 620, 84, 310, 2211, 35079, 10012, 50276, 296, 3755, 20556, 253, 1332, 9009, 275, 253, 2929, 310, 16694, 285, 40725, 247, 5322, 285, 417, 84, 412, 412, 792, 2867, 273, 253, 305, 12064, 10670, 12873, 1524, 285, 13358, 18012, 949, 253, 15355, 273, 253, 15301, 1159, 310, 247, 12532, 2934, 285, 778, 452, 247, 1175, 3486, 327, 2710, 13358, 13757, 4893, 253, 2929, 556, 271, 9470, 873, 273, 4679, 50276, 20881, 1255, 265, 891, 452, 767, 2022, 7350, 670, 436, 789, 581, 310, 253, 15180, 2105, 273, 253, 4081, 1332, 37851, 7274, 281, 13358, 13757, 2223, 11089, 432, 1029, 15180, 10454, 24088, 253, 12177, 310, 540, 44374, 253, 4081, 5199, 310, 1754, 327, 247, 1027, 2934, 533, 352, 4453, 751, 253, 4795, 13757, 4419, 1114, 442, 1113, 4213, 10491, 390, 1045, 2399, 34754, 253, 1273, 4468, 310, 670, 247, 1896, 5301, 342, 643, 3082, 323, 13358, 13757, 24088, 2870, 390, 625, 3332, 9508, 273, 352, 604, 253, 4477, 452, 690, 2130, 1543, 841, 812, 452, 644, 1691, 275, 625, 1941, 275, 253, 4679, 2593, 50276, 34974, 50276, 12756, 352, 320, 1896, 281, 897, 253, 1332, 323, 4715, 247, 8985, 48257, 835, 760, 253, 13461, 285, 417, 253, 3828, 14237, 403, 13358, 50276, 261, 436, 253, 806, 673, 305, 12064, 7882, 310, 908, 323, 4715, 34754, 273, 13358, 24995, 3470, 50276, 16534, 320, 253, 4081, 1332, 2429, 342, 643, 7274, 323, 13358, 13757, 24088, 2870, 327, 1097, 253, 3290, 273, 253, 3453, 285, 253, 15180, 2105, 50276, 5430, 1142, 13345, 305, 12064, 4903, 403, 3058, 323, 247, 277, 6967, 13757, 310, 253, 5199, 3264, 281, 320, 44755, 323, 6928, 342, 9790, 273, 3602, 247, 1077, 5322, 2934, 3164, 247, 2372, 44380, 327, 253, 15180, 1930, 5474, 339, 431, 248, 2929, 10262, 247, 747, 1332, 281, 6194, 11454, 6928, 342, 19191, 13358, 4903, 1925, 7882, 3963, 5837, 253, 1332, 32804, 50276, 783, 18012, 273, 3470, 273, 305, 12064, 3632, 4903, 281, 320, 2810, 281, 13358, 285, 12401, 643, 3082, 908, 342, 13358, 4903, 50276, 10328, 347, 305, 3561, 293, 2602, 4090, 534, 4419, 3276, 35375, 352, 310, 3477, 281, 19928, 253, 1332, 310, 5183, 327, 247, 1077, 6793, 5235, 273, 8892, 285, 3210, 285, 2722, 1375, 273, 253, 1445, 3045, 50275, 783, 2929, 21168, 327, 253, 4473, 273, 6046, 7882, 273, 3470, 273, 305, 12064, 4903, 50276, 24946, 7882, 5593, 253, 35124, 273, 824, 3470, 50276, 936, 6046, 407, 26230, 849, 616, 18012, 24888, 672, 597, 4763, 347, 3280, 9578, 305, 12064, 4903, 253, 2169, 253, 5921, 253, 625, 50276, 11351, 253, 1159, 323, 247, 2021, 273, 3470, 11542, 285, 4993, 4644, 253, 7882, 310, 11903, 1025, 323, 841, 3470, 326, 403, 15301, 3470, 50276, 1171, 2716, 8470, 534, 310, 247, 3626, 1039, 281, 3989, 8985, 4903, 253, 2929, 29328, 281, 22950, 7882, 275, 1340, 281, 4044, 31091, 50276, 39448, 50276, 783, 7092, 273, 253, 1332, 310, 2581, 2969, 285, 352, 3249, 342, 247, 1355, 15180, 18332, 1955, 281, 253, 958, 326, 352, 4419, 247, 4021, 7103, 273, 841, 15180, 5085, 3692, 3453, 359, 651, 971, 281, 7450, 281, 320, 13358, 50275, 783, 2929, 33826, 253, 3045, 273, 253, 1332, 275, 247, 11117, 873, 273, 4679, 4715, 13358, 21624, 8470, 305, 839, 15180, 5085, 970, 8985, 4903, 285, 11365, 13358, 18012, 891, 11435, 253, 37535, 273, 15216, 891, 760, 452, 247, 1355, 4385, 1060, 327, 253, 958, 326, 253, 14023, 403, 50276, 7483, 2218, 342, 253, 305, 3561, 293, 5530, 4090, 2139, 436, 4327, 285, 417, 7277, 671, 342, 690, 273, 253, 
643, 7274, 326, 253, 4477, 2319, 24088, 4951, 73, 903, 285, 263, 690, 11640, 273, 253, 4868, 1754, 48489, 50275, 1189, 455, 436, 310, 247, 5322, 7680, 281, 752, 310, 271, 1774, 1895, 4715, 13358, 19191, 4903, 253, 2934, 310, 2969, 285, 253, 4477, 7568, 352, 275, 247, 5235, 273, 15302, 253, 760, 2181, 326, 891, 651, 452, 10490, 281, 923, 310, 247, 5301, 342, 643, 3082, 326, 403, 5486, 281, 671, 789, 275, 824, 13358, 7533, 24088, 4951, 10489, 4868, 3169, 48489, 50276, 187, 187, 4118, 18435, 27, 783, 2929, 23970, 247, 1332, 281, 6194, 11454, 6928, 1754, 327, 9267, 18859, 7882, 3963, 5837, 253, 1332, 29426, 253, 18012, 273, 3470, 273, 305, 12064, 3632, 4903, 281, 320, 2810, 281, 13358, 285, 1057, 417, 2430, 3276, 35375, 751, 253, 305, 3561, 293, 2602, 4090, 512, 30628, 5821, 326, 253, 4081, 1332, 369, 4460, 285, 273, 1600, 253, 4477, 5196, 9470, 4679, 597, 671, 18212, 9713, 253, 7350, 5439, 407, 253, 30628, 24088, 10527, 12153, 285, 15180, 2105 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: overview the paper addresses the problem of learning from human feedback it provides an analysis of reward learningwhere human feedback is used to extract a task description in the form of a rewardand assistancewhere the learning agent and human coexist in the environment and both perform actions the agent seeks to select its actions to optimize the unknown that is implicit in the humans actions the paper shows that reward learning problems can be converted to assistance problems turning queries from reward learning to communicative actions in a twophase communicative assistance problem conversely twophase communicative assistance problems can be converted to active reward learning problems comments in my opinion the paper reads very well and the discussion in the paper is quite interesting in spite of an interesting discussion i am not certain about the contribution of the paper the derivations although they seem technically sound are not particularly surprising similarly the qualitative differences in behavior pointed out in the paper are hardly surprisingassistance learning being a more general problem than reward learning as follows from the discussion of the paper will lead to more diverse behaviors the particular behaviors observed are also a consequence of the pomdp formulation as the agent will act to strike an adequate tradeoff between information gathering and goal optimization postrebuttal update i thank the authors for the clarifications and discussion however and admitting that i may have missed something i remain unconvinced regarding the contributions of the paperdocsep summary the submission provides a survey of two paradigms for agents learning from human feedback the two paradigms are unified under a new formalism assistance games which subsumes them as its special cases further a taxonomy of different problems resulting from the formalism is provided communicative games twophase games etc along with illustrative examples of resulting agent behaviors based on the survey and taxonomy the authors highlight that the assistance paradigm is more advantageous in terms of possible behaviors that it can result in than the reward learning paradigm reasons for score strengths the topic of agent learning from human feedback is topical and of interest to the iclr community the proposed taxonomy ie assistance games can serve as a useful common ground for discussing the different paradigms problems and solutions of agent learning from human feedback the paper is well written and organized weaknesses while the submission does discuss related work section 2 the discussion omits several related research threads some of these threads have strong overlap with the setting proposed in the submission similarly the qualitative behaviors that emerge from assistance games section 4 have been demonstrated in prior research including on larger problems and with human users thereby making it difficult to assess the novelty of the formalism the key contribution of the submission is unclear eg whether it is a survey a model a taxonomy or all i am truly on the fence regarding this submission a novel taxonomy is certainly needed to relate and compare the diverse and growing body of research in the area of agent learning from human feedback however to arrive at this taxonomy a more complete consideration of existing formalisms and algorithms is necessary please see suggestions listed below on 
prior research that relates to and in certain cases extends the formalism of assistance collaboration key comments 1 introduction and proposition 1 the insight of having a single control policy for both reward learning and control modules has been previously explored this insight is identical to that of planning control formulations in humanrobot interaction literature that model the humans preferences which in turn influence the reward as latent states these planning methods use pomdps to represent the interaction collaboration problem and solve the explorationexploitation tradeoff associated with reward learning exploration and control exploitation using pomdp solvers or mpc please discuss the novelty of the proposed formalism in relation to these methods for instance considers continuous states and actions spaces assistance paradigm sadigh dorsa et al information gathering actions over human internal state 2016 ieeersj international conference on intelligent robots and systems iros ieee 2016 does not require one agent r or h to act before the other assistance paradigm chen min et al planning with trust for humanrobot collaboration proceedings of the 2018 acmieee international conference on humanrobot interaction 2018 gopalan nakul and stefanie tellex modeling and solving humanrobot collaborative tasks using pomdps rss workshop on model learning for humanrobot communication 2015 nikolaidis stefanos et al gametheoretic modeling of human adaptation in humanrobot collaboration proceedings of the 2017 acmieee international conference on humanrobot interaction 2017 2 section 23 assistance games as defined assume parametric specification of the hypothesis space of the reward preference of humans however nonparametric extensions both for assistance and reward learning have been proposed please consider relating the proposed formalism with these prior works for instance michini bernard and jonathan p how bayesian nonparametric inverse reinforcement learning joint european conference on machine learning and knowledge discovery in databases springer berlin heidelberg 2012 panella alessandro and piotr gmytrasiewicz bayesian learning of other agents finite controllers for interactive pomdps proceedings of the thirtieth aaai conference on artificial intelligence 2016 3 section 23 in assistance games as defined the reward or human preferences over reward does not change during the task however extensions exist which model the latent state corresponding to reward as being locally active and or time varying please consider relating the proposed formalism with these related works for example locally active reward reward learning paradigm michini bernard and jonathan p how bayesian nonparametric inverse reinforcement learning joint european conference on machine learning and knowledge discovery in databases springer berlin heidelberg 2012 locally active reward reward learning paradigm park daehyung et al inferring task goals and constraints using bayesian nonparametric inverse reinforcement learning conference on robot learning pmlr 2020 considers time varying preferences with learned dynamics assistance paradigm nikolaidis stefanos david hsu and siddhartha srinivasa humanrobot mutual adaptation in collaborative tasks models and experiments the international journal of robotics research 3657 2017 618634 considers time varying preferences with learned dynamics assistance paradigm unhelkar vaibhav v shen li and julie a shah semisupervised learning of decisionmaking models for humanrobot collaboration 
conference on robot learning 2020 4 the behaviors arising from solving assistance games section 4 have been previously formalized by multiple humanai collaboration approaches and demonstrated with human users for instance exhibits behaviors outlined in section 42 kamar ece yaakov gal and barbara j grosz incorporating helpful behavior into collaborative planning proceedings of the 8th international conference on autonomous agents and multiagent systems aamas springer verlag 2009 reasons about communicative actions whitney david et al reducing errors in objectfetching interactions through social feedback 2017 ieee international conference on robotics and automation icra ieee 2017 reasons about both physical actions and communications nikolaidis stefanos et al planning with verbal communication for humanrobot collaboration acm transactions on humanrobot interaction thri 73 2018 121 reasons about both physical actions and communications unhelkar vaibhav v shen li and julie a shah decisionmaking for bidirectional communication in sequential humanrobot collaborative tasks proceedings of the 2020 acmieee international conference on humanrobot interaction 2020 communicative actions liang claire et al implicit communication of actionable information in humanai teams proceedings of the 2019 chi conference on human factors in computing systems 2019 please discuss the connection of the proposed formalism with these approaches 5 section 22 please also consider discussing the following related works on active reward learning lopes manuel francisco melo and luis montesano active learning for reward estimation in inverse reinforcement learning joint european conference on machine learning and knowledge discovery in databases springer berlin heidelberg 2009 brown daniel s yuchen cui and scott niekum riskaware active inverse reinforcement learning conference on robot learning 2018 tschiatschek sebastian et al learneraware teaching inverse reinforcement learning with preferences and constraints advances in neural information processing systems 2019 cui yuchen and scott niekum active reward learning from critiques 2018 ieee international conference on robotics and automation icra ieee 2018 minor comment section 21 please consider including a note explaining the asterisk notation used to define the domain for pomdp policy docsepupdate after extensive discussion with the authors im raising my score to a 7 i believe their revision will adequately address the concerns ive raised i think this paper clearly identifies and illustrates the qualitative advantages of assistance and that this is a novel and significant contribution in particular i do not believe as other reviewers seem to that any of the following are sufficient reasons for rejecting this work the wealth of prior work on variants of reward learning and assistance the lack of a comprehensive survey or categorization of such work in this submission the lack of further results after reading the other reviews and responses i am more confident that this paper makes a valuable contribution although i stand ready to be challenged by other reviewers this is because the authors have argued that the qualitative benefits they describe in sections 4123 have not been available to any of the many previous works reviewers mentioned and no reviewers disputed this furthermore i did not find the reasons provided for rejection to be very relevant to the goals of this work so overall i do not believe that other reviewers have made a strong case for rejecting this work in 
my mind the best argument would seem to be simply that the contribution is insufficient i think this is a common criticism of papers that do not adhere to a conventional format or type but in this case it seems unfair i believe the intellectual contribution of this paper is rather modest but nonetheless novel and significant and i found this motivation for the work quite compelling emphasis mine this existing literature is exactly why we wrote this paper almost all of the existing literature on learning without reward functions can be captured by the reward learning paradigm as we formalize it but as we show the assistance paradigm can enable significantly better behavior from the agents we train we are hoping to influence researchers to put more effort into algorithms for the assistance domain in order to realize these qualitative benefits instead of continuing to work in the reward learning paradigm as they have done so far i would encourage the authors to explain this goal in their revision and make sure their claims about the superiority of assistance are appropriately modest overall i think the qualitative benefits of assistance presented provide a compelling argument for more work in the assistance paradigm given the paucity of such work but i think the overall message of the paper should be given these advantages and the lack of work on assistance there should be more work on assistance since it seems promising and neglected and not assistance is better so why would you do reward learning and my first impression of the paper was closer to the latter end update evaluation this paper is well written and makes a nice point regarding qualitative advantages of assistance over reward learning  specifically the authors show how assistance naturally leads an agent to agent to 1 choose questions based on their relevance 2 create plans whose success depends on future feedback and 3 learn from physical human actions in addition to communicative feedback however the framing is quite onesided and the authors make no mention of potential advantages of reward learning  and it is possible to achieve the same qualitative advantages by slightly modifying the rather restrictive formulation of reward learning that the authors use  i think the work should either include such a discussion or offer more convincing evidence regarding when assistance is in fact preferable in practice eg experiments with some features that might advantage either approach some potential advantages of reward learning are reduced complexity of the learning problem the human retains more control it seems to require less modeling of human psychology less opportunities to corrupt the reward signal regarding the qualitative advantages mentioned in sections 412 we can achieve the same benefits in many instances by incorporating regular feedback sessions where h and r can communicate  i was confused by the claimed advantage in 43 since irl already can learn from physical actions overall i think this paper sets up a bit of a false dichotomy between noninteractive reward learning and the full assistance game formulation  we can instead view these methods as varying wrt 1 when and how we ask r to interpret hs actions as communicative and 2 how interactive the learning process is this paper still makes important corresponding points 1 there is a benefit to considering hs behavior as communicative whenever we understand its semantics and 2 interaction is important   these are not very surprising although very nicely and clearly argued for 
and demonstrated however regarding point 1 if we dont understand the semantics of hs behavior it makes sense to restrict the communication to a more welldefined channel as in reward learning furthermore even if we do know the semantics h may not wish all of her behavior to be viewed as communicative and allowing her to directly control when and how her behavior will be viewed as communicative grants h more agency over the behavior of r accepting points 1 and 2 it still seems like most effective methods are likely to lie somewhere between the two extremes described in this work  for example the authors state thus for good performance we need to model h this will require insights from a broad range of fields that study human behavior  however accurately modelling h  seems significantly less urgent when communication from h is more limited and literal ie when we move closer towards current reward learning practice as another piece of loosely supporting evidence in literal or pedagogic human analyzing human model misspecification in objective learning milli and dragan show that more sophisticated assumptions about the semantics of human behaviors may also be more brittle another contribution of the paper is to explicitly frame reward learning as a special case of assistance  this seems like a straightforward minor contribution i was also disappointed that the paper didnt discuss reward corruption the need to more fully understand h in the assistance paradigm creates more opportunities for r to misinterpret hs behavior in practice the tighter feedback loops favored by the assistance paradigm might also create more opportunities for r to eg irreversibly corrupt hs reward function andor policy since an initial misunderstanding could be selfreinforcing  for instance h might fail to tell r that its behavior was intimidating if doing so previously had led r to become more instead of less intimidating  this could lead r to become confident that this behavior was not intimidating to h  such a scenario seems less likely when h provides feedback to r in an offline or purely communicative context detailed suggestions figure 1 caption could be clearer what is depicted on the right these behaviors cannot be expressed by a reward learning agent not literally true suggest a rephrase since c0k1 only serves to provide information to r it doesnt have to eg people do irl on data that doesnt fit this description provide more of an informal introduction in section 23 before jumping into definitions however since we want to compare to reward learning we follow its assumptions and so assume that the human policy h is available while i was eventually able to understand this reasoning it sounds false out of context and should be clarified ahnoop is not defined i consider the first paragraph of section 4 to be the meat of the paper and i think some of this content should be frontloaded more  its a shame to wait until page 6 i would start each of 4123 with a paragraph giving a nonmathematical qualitative description of the result i like the paragraph at top of page 7 which gives an explanation of how reward learning would actfail  can you include such a paragraph in 413 as well  i think the more you mirror structure in these sections the better docsepsummary this work proposes learning a single control policy for humanintheloop learning rather than having a reward learning component and a control component the key difference is that the action selection can use information from the reward learning module the authors 
formulate an assistance game in this setting and show that it can reduce to an equivalent pomdp the work then describes a communicative assistance problem and shows the equivalence of reward learning to assistance and visa versa results show qualitative improvements on variants of the kitchen domain pros the paper was overall wellwritten although some key differences with prior work were unclear as described below the highlevel area of humaninthelearning is a useful and important space cons there are many works in humanintheloop learning and it wasnt clear whether the ideas in this paper were novel enough at a highlevel some relevant works are included below the paper was hard to follow with respect to the distinction between reward learning and assistance adding more examples or describing the difference from other angles could be useful the experimental domains were simple and there were no computational results shown in the main paper this is important to show in order to support the claims of the paper comments the qualitative results are useful for showing specific cases in which the proposed approach may be beneficial but this restricts how applicable the approach seems to be it would be nice to see computational results over a variety of domains and comparisons with baseline approaches to see the benefit of the approach more generally section 42 focuses on asking the right questions at the right time what is the main novelty here with respect to more general active learning approaches a few minor notational confusions for example its confusing to have r be the robot as its often the reward function and in section 41 c is used for choices and for cherry other work that might be relevant to the problem proposed here gametheoretic modeling of human adaptation in humanrobot collaboration s nikolaidis s nath ad procaccia s srinivasa maximizing bci human feedback using active learning z wang j shi i akinola p allen modeling humans as observation providers using pomdps s rosenthal m veloso active learning for risksensitive inverse reinforcement learning r chen w wang z zhao d zhao contact deciding to communicate during timecritical collaborative tasks in unknown deterministic domains vv unhelkar ja shah efficient model learning from jointaction demonstrations for humanrobot collaborative tasks s nikolaidis r ramakrishnan k gu j shah recommendation overall i thought the work at a highlevel was limited in its novelty compared with prior work in humanintheloop learning the distinction between reward learning and assistance a key part of the paper was hard to fully understand so clarifying this description through examples or more clear text would be valuable the evaluation was also simple and not very convincing with respect to supporting the claims adding computational experiments and appropriate baselines would be important to show the general applicability of the approach response after rebuttal thank you to the authors for their response i appreciate the detailed answers to each of the prior works i still do think the paper needs to be more clear in the problem and differentiation with prior work in the writing itself which will require a nontrivial update to the paper on the computational results baselines side while the authors have run the experiment and have described these qualitative behaviors this isnt a substitute for quantitative results especially because this is important to show when comparing with baselines the authors say we have updated section 4 to be clearer about what 
baseline approaches would do in the environments we have tested its important to show evidence that this is what the baselines actually did based on these points i dont think the paper is quite ready for publication yetdocsep postrebuttal given the effort of the authors of improving their manuscript i am improving my original score however my evaluation is still weak reject for the reasons below 1 i still fail to see clear differences between assistance as defined by the authors and the other reinforcement learninglike approaches that assume that the reward function is unknown i can see that they perhaps provide a more organized and methodological description of how that assistance can happen when compared to the previous works however the paper lacks practical advice exactly how should i build an agent to leverage such assistance i dont think their ideas are so novel that other methods couldnt be at least adapted to work in their scenario to include some empirical evaluation in the manuscript 2 the paper seems a little displaced to me in this conference the paper neither provides practical and direct guidance on how to build algorithms to leverage assistance nor is a survey that focuses on organizing the area and discussing differences between works perhaps the paper would be better placed in a blue sky track the authors propose two learning paradigms where the learning agent doesnt have access to a reward function but instead has to learn directly from the assistance from a trainer agent while looking for ways to facilitate task specification and human integration in the learning process is a relevant and promising research goal the authors dont explain the difference between their newlyproposed paradigms and the very rich literature on inverse reinforcement learning and learning from demonstrations learning from human assistance instead of a reward function is not a new thing and the surveys below not cited by the authors summarize a rich literature that does precisely that silva felipe leno and anna helena reali costa a survey on transfer learning for multiagent reinforcement learning systems journal of artificial intelligence research 64 2019 645703 surveys many categories of works where one agent provides guidance to others including humans providing assistance to learning agents argall brenna d et al a survey of robot learning from demonstration robotics and autonomous systems 575 2009 469483 surveys learning from demonstration where a human provides policy demonstrations to a learning agent which usually doesnt have access to a reward function gao yang et al a survey of inverse reinforcement learning techniques international journal of intelligent computing and cybernetics 2012 inverse reinforcement learning where the learning agent doesnt have access to a reward function and has to infer a policy from human assistance without a comprehensive discussion about the differences between the authors proposal and those paradigms i cant judge the paper contribution to my eyes all the descriptions gave in the paper look just like the same as one of the problems surveyed in the papers above for example nonactive reward learning seems to me equivalent to inverse reinforcement learning also active reward learning seems to me a special case of the action advising problem surveyed in the first paper above in the case the authors deal with a different problem in this paper i suggest that the manuscript is rewritten to thoughtfully explain the difference between all those scenarios in case they 
indeed are correspondent scenarios i suggest that the authors use the same notation as the previous works and include comparisons with the state of the art methods in the experimental evaluation other suggestions i dont get how the policy decision function will compute the expected reward if the reward function is unknown you assume access to a human decision function what exactly does that mean do you need the probabilities for taking each action if thats the case it is very unrealistic to expect that a human is able to provide probabilities for each action asking she to simply pick one action when requested is more realistic ### Summary:
this is a well written paper outlining a class of assistive algorithms being more or less a survey paper it could do a better job of discussing inverse reinforcement learning and collaborative inverse reinforcement learning it could also be slightly more general for example the human decision function need not be known if we model the interaction as a bayesian game then the human might have a latent type which can be inferred together with the reward function the active reward learning problem is sometimes referred to as preference elicitation in the end it was not clear that the discussion in this paper had any actionable insights for future models or algorithms in this area
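To make the Bayesian-game reading in the summary above concrete, here is a minimal illustrative sketch (it is not taken from the reviewed paper or the reviews) of the core computation it implies: maintaining a belief over a discrete set of reward hypotheses, updating that belief from an observed human action under an assumed Boltzmann-rational human model, and then choosing the assistive agent's action by expected value under the belief. The names beta, q_per_hypothesis and returns_per_hypothesis are stand-ins introduced here for illustration, not notation from the paper under review.

```python
import numpy as np

def boltzmann_likelihood(q_values, action, beta=2.0):
    # p(human action | reward hypothesis) for an assumed Boltzmann-rational human
    prefs = np.exp(beta * (q_values - q_values.max()))
    return prefs[action] / prefs.sum()

def update_belief(belief, q_per_hypothesis, observed_action, beta=2.0):
    # Bayesian update of the belief over reward hypotheses after one observed human action
    likelihoods = np.array([
        boltzmann_likelihood(q, observed_action, beta) for q in q_per_hypothesis
    ])
    posterior = belief * likelihoods
    return posterior / posterior.sum()

def assistive_action(belief, returns_per_hypothesis):
    # pick the agent action with the highest expected return under the current belief;
    # returns_per_hypothesis[h][a] is the return of agent action a if hypothesis h is true
    expected = belief @ returns_per_hypothesis
    return int(np.argmax(expected))

if __name__ == "__main__":
    belief = np.array([0.5, 0.5])                   # two reward hypotheses, uniform prior
    q_per_hypothesis = [np.array([1.0, 0.0, 0.2]),  # human action values under hypothesis 0
                        np.array([0.0, 1.0, 0.2])]  # human action values under hypothesis 1
    returns_per_hypothesis = np.array([[1.0, 0.0],  # agent returns under hypothesis 0
                                       [0.0, 1.0]]) # agent returns under hypothesis 1

    belief = update_belief(belief, q_per_hypothesis, observed_action=0)
    print("posterior over reward hypotheses:", belief)
    print("assistive action under the belief:", assistive_action(belief, returns_per_hypothesis))
```

In the full assistance formulation the same belief enters a POMDP whose policy also plans over future human feedback, which is what the qualitative arguments in the reviews are about; the sketch only shows the single-step core of inferring the latent reward together with acting on it.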
[ 432, 288, 310, 625, 3710, 285, 22436, 26332, 672, 359, 2118, 8003, 4404, 1655, 10921, 4715, 3946, 50276, 284, 1529, 5313, 273, 35056, 8109, 1941, 575, 249, 575, 34943, 390, 7690, 356, 462, 280, 1966, 18918, 1966, 1566, 2985, 1553, 1877, 275, 8103, 4715, 278, 3370, 285, 9310, 266, 921, 326, 625, 18144, 13260, 670, 253, 35185, 273, 1966, 13576, 778, 671, 320, 625, 1308, 1522, 50276, 23955, 7680, 273, 253, 2929, 310, 281, 11120, 3665, 10921, 4715, 347, 247, 2714, 1083, 273, 8385, 575, 436, 3133, 751, 247, 15246, 5884, 7680, 50276, 74, 369, 671, 19271, 326, 253, 2929, 42126, 2319, 10921, 16933, 575, 783, 878, 281, 625, 4751, 2096, 288, 275, 253, 8385, 22199, 10513, 625, 9091, 323, 391, 281, 3731, 22416, 49343, 3879, 50276, 249, 3946, 253, 40638, 8680, 17417, 25973, 407, 253, 8385, 22199, 1537, 671, 575, 6953, 625, 9091, 323, 391, 281, 24088, 21388, 735, 4360, 17715, 49343, 10921, 1159, 285, 263, 3646, 1580, 271, 3302, 40663, 812, 320, 1881, 39910, 22958, 575, 323, 4227, 575, 73, 1537, 1891, 281, 2028, 391, 326, 697, 3879, 369, 28427, 839, 604, 2509, 594, 3786, 574, 3977, 391, 281, 2489, 625, 3185, 273, 1679, 28427, 839, 575, 436, 812, 1421, 391, 281, 2489, 13224, 326, 436, 3879, 369, 417, 28427, 839, 281, 288, 575, 824, 247, 10076, 3133, 1679, 2779, 672, 288, 3400, 8680, 281, 391, 275, 271, 28841, 390, 15846, 3461, 800, 3634, 50275, 5992, 7193, 13991, 50276, 13206, 337, 11743, 812, 320, 30909, 752, 310, 17253, 327, 253, 987, 50276, 20513, 13576, 2550, 320, 4469, 407, 247, 10921, 4715, 5570, 50276, 1439, 12832, 2032, 1804, 247, 294, 40712, 50276, 17480, 260, 17, 76, 18, 760, 11029, 281, 2085, 1491, 281, 391, 50276, 262, 36908, 452, 281, 24088, 952, 513, 209, 2587, 327, 941, 326, 36908, 4944, 436, 5740, 50276, 42260, 625, 273, 271, 25040, 10199, 275, 2593, 3495, 1078, 22802, 715, 14308, 50276, 35529, 1580, 359, 971, 281, 7277, 281, 575, 250, 1034, 4715, 359, 956, 697, 13260, 285, 594, 5467, 326, 253, 1966, 3646, 288, 310, 2130, 575, 6050, 891, 369, 6524, 2104, 281, 2096, 436, 14720, 352, 7835, 3221, 562, 273, 3634, 285, 943, 320, 31637, 50276, 1240, 27607, 310, 417, 2931, 50276, 74, 1908, 253, 806, 12494, 273, 2593, 577, 281, 320, 253, 9132, 273, 253, 2929, 285, 891, 1158, 690, 273, 436, 2600, 943, 320, 2914, 19052, 625, 575, 697, 247, 14816, 281, 3343, 1919, 3239, 721, 50276, 74, 651, 1265, 1016, 273, 577, 10683, 342, 247, 12494, 4933, 247, 1327, 2056, 10479, 474, 18276, 5740, 273, 253, 906, 50276, 74, 751, 253, 12494, 387, 1755, 273, 3239, 818, 534, 4245, 271, 8813, 273, 849, 10921, 4715, 651, 769, 24796, 575, 476, 368, 2486, 824, 247, 12494, 275, 37066, 347, 973, 575, 891, 1158, 253, 625, 368, 11472, 2605, 275, 841, 7118, 253, 1805, 575, 7152, 339, 793, 360, 3454, 436, 789, 29328, 4715, 247, 2014, 1453, 3646, 323, 1966, 565, 2955, 24318, 4715, 2581, 685, 1907, 247, 10921, 4715, 4445, 285, 247, 1453, 4445, 253, 2234, 3064, 310, 326, 253, 2250, 5438, 476, 897, 1491, 432, 253, 10921, 4715, 6333, 253, 4477, 36803, 271, 8385, 2165, 275, 436, 4758, 285, 921, 326, 352, 476, 4796, 281, 271, 6425, 31204, 12132, 253, 789, 840, 8631, 247, 3461, 800, 8385, 1895, 285, 2722, 253, 19945, 273, 10921, 4715, 281, 8385, 285, 25105, 26620, 1543, 921, 18276, 11701, 327, 11640, 273, 253, 8576, 5028, 50276, 856, 84, 50276, 783, 2929, 369, 4583, 973, 15720, 3738, 690, 2234, 3910, 342, 2720, 789, 497, 12744, 347, 2529, 2708, 50276, 783, 1029, 5251, 2170, 273, 1966, 565, 248, 28269, 310, 247, 4217, 285, 1774, 2317, 50276, 5040, 50276, 9088, 403, 1142, 2987, 275, 1966, 565, 2955, 24318, 4715, 285, 352, 369, 2649, 
2590, 1880, 253, 5697, 275, 436, 2929, 497, 4460, 2217, 387, 247, 1029, 5251, 690, 4623, 2987, 403, 2908, 2708, 50276, 783, 2929, 369, 1892, 281, 956, 342, 1675, 281, 253, 13812, 875, 10921, 4715, 285, 8385, 6240, 625, 6667, 390, 12930, 253, 3064, 432, 643, 14636, 812, 320, 4217, 50276, 783, 5661, 10625, 497, 2969, 285, 627, 497, 642, 15180, 1543, 2011, 275, 253, 2022, 2929, 436, 310, 1774, 281, 921, 275, 1340, 281, 1329, 253, 3916, 273, 253, 2929, 50276, 26122, 50276, 783, 18276, 1543, 403, 4217, 323, 4645, 2173, 2219, 275, 534, 253, 4081, 2746, 778, 320, 12912, 533, 436, 45798, 849, 7763, 253, 2746, 3133, 281, 320, 352, 651, 320, 5322, 281, 923, 15180, 1543, 689, 247, 5235, 273, 10625, 285, 14023, 342, 8245, 7274, 281, 923, 253, 5649, 273, 253, 2746, 625, 3839, 50275, 4674, 5976, 16633, 327, 7004, 253, 987, 3533, 387, 253, 987, 673, 752, 310, 253, 2022, 38135, 1060, 342, 1675, 281, 625, 2087, 3939, 4715, 7274, 50276, 66, 1643, 5884, 417, 1050, 1461, 16723, 323, 1650, 697, 21643, 281, 452, 391, 320, 253, 15688, 347, 697, 2223, 253, 10921, 1159, 285, 275, 2593, 7609, 260, 310, 908, 323, 10165, 285, 323, 33804, 50276, 977, 789, 326, 1537, 320, 4623, 281, 253, 1895, 4081, 1060, 50276, 72, 312, 10666, 30325, 14053, 273, 1966, 15644, 275, 1966, 287, 12042, 14448, 256, 295, 1479, 6836, 30861, 256, 295, 506, 519, 15613, 3649, 571, 256, 256, 11078, 400, 19924, 50276, 785, 3266, 3006, 270, 5297, 1966, 8680, 970, 3939, 4715, 1182, 259, 606, 480, 439, 74, 891, 33917, 6836, 268, 512, 257, 50276, 7645, 272, 7497, 347, 8310, 11967, 970, 31204, 69, 793, 256, 687, 17094, 5590, 278, 5828, 26471, 50276, 4507, 4715, 323, 10502, 18917, 13737, 35221, 4715, 391, 260, 864, 259, 259, 606, 1182, 1182, 31035, 277, 1182, 31035, 50276, 22045, 18000, 281, 13791, 1309, 673, 26717, 27549, 8892, 275, 7202, 30027, 10625, 362, 87, 440, 2955, 18970, 8729, 439, 1240, 50276, 20246, 1566, 4715, 432, 6036, 1913, 32367, 323, 1966, 287, 12042, 27549, 8892, 256, 295, 1479, 6836, 30861, 391, 17653, 518, 41657, 11943, 465, 1149, 480, 439, 1240, 50276, 250, 27167, 318, 4583, 891, 1869, 253, 789, 387, 247, 1029, 5251, 369, 3710, 275, 697, 38135, 2429, 342, 2720, 789, 275, 1966, 565, 2955, 24318, 4715, 253, 13812, 875, 10921, 4715, 285, 8385, 247, 2234, 629, 273, 253, 2929, 369, 1892, 281, 4751, 2096, 594, 8254, 5411, 436, 5740, 949, 6667, 390, 625, 2590, 2505, 651, 320, 9865, 253, 7103, 369, 671, 2969, 285, 417, 1077, 21414, 342, 1675, 281, 8109, 253, 3916, 6240, 15180, 4679, 285, 4569, 1666, 25379, 651, 320, 1774, 281, 921, 253, 2087, 30437, 273, 253, 2746, 50275, 10927, 846, 30080, 22559, 5717, 368, 281, 253, 4477, 323, 616, 2380, 891, 11435, 253, 7000, 9172, 281, 1016, 273, 253, 2720, 2987, 891, 1335, 513, 1158, 253, 2929, 3198, 281, 320, 625, 2590, 275, 253, 1895, 285, 9827, 342, 2720, 789, 275, 253, 4028, 3139, 534, 588, 2430, 247, 37825, 5731, 281, 253, 2929, 50276, 251, 253, 15180, 1543, 50276, 10352, 25379, 1930, 1223, 253, 4477, 452, 1408, 253, 3368, 285, 452, 2529, 841, 18276, 13576, 436, 310, 2649, 247, 16502, 323, 11745, 1543, 3340, 984, 436, 310, 1774, 281, 921, 672, 10941, 342, 1666, 25379, 253, 4477, 1333, 359, 452, 9300, 2593, 577, 281, 320, 30909, 670, 752, 8245, 7274, 651, 513, 275, 253, 12620, 359, 452, 5762, 697, 1774, 281, 921, 1941, 326, 436, 310, 752, 253, 1666, 25379, 2686, 858, 50276, 3169, 327, 841, 2792, 891, 13414, 1158, 253, 2929, 310, 3240, 4704, 323, 9311, 2568, 7152, 33032, 1501, 250, 2858, 22559, 50276, 28821, 253, 3434, 273, 253, 4477, 273, 11138, 616, 7714, 891, 717, 11138, 619, 3236, 4868, 2299, 
619, 7103, 310, 1335, 5075, 12009, 323, 253, 4606, 2708, 50276, 18, 891, 1335, 1891, 281, 923, 2590, 3910, 875, 8385, 347, 2931, 407, 253, 4477, 285, 253, 643, 35221, 4715, 3022, 7274, 326, 5467, 326, 253, 10921, 1159, 310, 7202, 891, 476, 923, 326, 597, 4931, 2085, 247, 625, 10932, 285, 35961, 5740, 273, 849, 326, 8385, 476, 5108, 672, 2429, 281, 253, 2045, 2987, 2299, 253, 2929, 19756, 8542, 7535, 4555, 849, 943, 891, 1973, 271, 5570, 281, 25057, 824, 8385, 891, 13414, 1158, 616, 5697, 403, 594, 4460, 326, 643, 3082, 812, 2649, 320, 387, 1878, 12956, 281, 789, 275, 616, 10076, 281, 2486, 690, 16774, 7103, 275, 253, 7714, 50276, 19, 253, 2929, 3133, 247, 1652, 26699, 281, 479, 275, 436, 8059, 253, 2929, 6747, 3400, 8542, 285, 1480, 12925, 327, 849, 281, 1973, 11333, 281, 25057, 8385, 4543, 310, 247, 6630, 326, 16633, 327, 26169, 253, 2170, 285, 16585, 3910, 875, 2987, 4931, 253, 2929, 651, 320, 1805, 4845, 275, 247, 4797, 8467, 3540, 50273, 783, 4477, 12661, 767, 4715, 11951, 304, 983, 835, 253, 4715, 5570, 36908, 452, 2289, 281, 247, 10921, 1159, 533, 3185, 556, 281, 3037, 3587, 432, 253, 8385, 432, 247, 33837, 5570, 50276, 6050, 2819, 323, 4088, 281, 12454, 4836, 17776, 285, 1966, 9554, 275, 253, 4715, 1232, 310, 247, 4623, 285, 12532, 2561, 4736, 253, 4477, 13414, 5513, 253, 3064, 875, 616, 9841, 856, 7334, 11951, 304, 983, 285, 253, 1077, 6793, 6239, 327, 13737, 35221, 4715, 285, 4715, 432, 32367, 50276, 28269, 432, 1966, 8385, 3185, 273, 247, 10921, 1159, 310, 417, 247, 747, 2181, 285, 253, 17276, 2708, 417, 11106, 407, 253, 4477, 26799, 247, 6793, 6239, 326, 1057, 10534, 326, 50276, 17525, 6156, 11664, 5495, 8472, 80, 285, 271, 2072, 1203, 6736, 1524, 74, 2105, 66, 247, 6630, 327, 3700, 4715, 323, 4471, 12788, 35221, 4715, 2718, 6698, 273, 13345, 9260, 2561, 6705, 6247, 721, 1857, 30349, 50276, 9960, 306, 656, 1142, 9050, 273, 2987, 835, 581, 5570, 3400, 12925, 281, 2571, 1690, 7497, 5277, 8385, 281, 4715, 6083, 50276, 1662, 455, 270, 445, 2072, 277, 1162, 355, 247, 6630, 273, 15688, 4715, 432, 20028, 15688, 982, 285, 26279, 2718, 45916, 4748, 40089, 32282, 50276, 9960, 306, 656, 4715, 432, 20028, 835, 247, 1966, 3400, 3646, 32367, 281, 247, 4715, 5570, 534, 3798, 36908, 452, 2289, 281, 247, 10921, 1159, 50276, 2485, 80, 30966, 1162, 355, 247, 6630, 273, 13737, 35221, 4715, 5609, 5213, 6698, 273, 17497, 12672, 285, 20239, 3024, 982, 4050, 50276, 46429, 35221, 4715, 835, 253, 4715, 5570, 36908, 452, 2289, 281, 247, 10921, 1159, 285, 556, 281, 9441, 247, 3646, 432, 1966, 8385, 50276, 14920, 247, 11088, 5955, 670, 253, 3910, 875, 253, 4477, 10419, 285, 1110, 11951, 304, 983, 891, 16216, 5963, 253, 2929, 7680, 281, 619, 2927, 512, 253, 20121, 3534, 275, 253, 2929, 1007, 816, 751, 253, 1072, 347, 581, 273, 253, 3237, 28671, 275, 253, 9380, 1840, 50276, 1542, 1650, 1327, 4507, 10921, 4715, 3133, 281, 479, 6425, 281, 13737, 35221, 4715, 50276, 12563, 3939, 10921, 4715, 3133, 281, 479, 247, 2714, 1083, 273, 253, 2250, 44083, 1895, 28671, 275, 253, 806, 2929, 1840, 50276, 249, 253, 1083, 253, 4477, 2968, 342, 247, 1027, 1895, 275, 436, 2929, 891, 1804, 326, 253, 7714, 310, 35993, 281, 1869, 2920, 5513, 253, 3064, 875, 512, 1110, 15216, 275, 1083, 597, 6296, 403, 35260, 15216, 891, 1804, 326, 253, 4477, 897, 253, 1072, 14951, 347, 253, 2045, 2987, 285, 2486, 14023, 342, 253, 1375, 273, 253, 1445, 3082, 275, 253, 5661, 7103, 50275, 977, 13991, 50275, 74, 13414, 755, 849, 253, 3646, 3061, 1159, 588, 11897, 253, 3264, 10921, 604, 253, 10921, 1159, 310, 7202, 50275, 5658, 5467, 2289, 281, 247, 
1966, 3061, 1159, 752, 4555, 1057, 326, 1599, 513, 368, 878, 253, 20552, 323, 3192, 1016, 2250, 604, 28763, 253, 1083, 352, 310, 1077, 46521, 281, 1902, 326, 247, 1966, 310, 2104, 281, 2085, 20552, 323, 1016, 2250, 7004, 703, 281, 3365, 2619, 581, 2250, 672, 9521, 310, 625, 15958, 50276, 187, 187, 4118, 18435, 27, 2520, 310, 247, 973, 3542, 2929, 562, 30927, 247, 966, 273, 10073, 422, 11333, 1146, 625, 390, 1679, 247, 6630, 2929, 352, 812, 513, 247, 1805, 2628, 273, 16585, 13737, 35221, 4715, 285, 27549, 13737, 35221, 4715, 352, 812, 671, 320, 5777, 625, 2087, 323, 1650, 253, 1966, 372, 38212, 1297, 1159, 878, 417, 320, 1929, 604, 359, 1566, 253, 5016, 347, 247, 17699, 16561, 2165, 840, 253, 1966, 1537, 452, 247, 21624, 1511, 534, 476, 320, 22245, 2366, 342, 253, 10921, 1159, 253, 3939, 10921, 4715, 1895, 310, 4536, 6289, 281, 347, 14682, 24021, 3535, 275, 253, 990, 352, 369, 417, 2590, 326, 253, 5955, 275, 436, 2929, 574, 667, 49353, 16039, 323, 2852, 3210, 390, 11333, 275, 436, 2170, 209 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this work aims at advancing nighttime flare removal and proposes a new dataset called flare7k based on the proposed dataset extensive experiments are conducted which demonstrates that the proposed dataset can complement the diversity of existing flare datasets and push the frontier of nighttime flare removal this proposed dataset is mainly designed to improve the quality of nighttime images and might help the community to strengthen the research 1 some noreference metrics should be used in the table 2 of the manuscript 2 in th table 2 the benchmarking study is not convcing due to lack some the general methods for image restoration such as mprnet etc docsepthis paper presents the first synthetic dataset for nighttime flare removal authors collected diverse nighttime flare images with different types of cameras and light sources and then render flares based on the observations and statistics these rendered flares can be added to flarefree nighttime images to form paired nighttime flarecorrupted and flarefree data compared to the only daytime flare removal dataset flares generated in this paper look more similar to real flares in night scenes and more diverse annotations of flare components are also provided in this dataset which is helpful for other interesting applications related to nighttime flare images experimental results claims the effectiveness of this synthetic nighttime flare removal dataset 1 the first nighttime flare removal dataset with realisticlooking synthetic flares 2 additional annotation of flare components 1 as claimed in the abstract and conclusion nighttime flare also degrades the performance of vision algorithms thus it may be better to include a simple experiment showing the benefits the dataset can bring to existing downstream vision algorithms eg the flareremoved outputs result in better segmentation or detection performance 2 it may be better to compare more existing flare removal methods rather than haze removal and image enhancement methods in the experiments part it is good to include such methods for comparison but training more learningbased methods can better claim the effectiveness of the proposed dataset 3 the writing and figures can be improved further see the clarify part below docsepthe authors propose a firstofitskind dataset flare7k for nighttime flare removal the dataset has 5000 scattering flare images and 2000 reflective flare images in the dataset the scattering flares are of 25 types and the reflective flares are of 10 types in addition the dataset has more annotations with respect to light source glare with shimmer reflective flare and streak than other flare removal datasets the dataset is trained on unet and the results are compared with previous datasets it is the the first nighttime flare removal dataset more diversity in annotation with respect to light source reflective flare and streak more number of images compared to previous datasets 1firstly the manuscript has several typos and sentences are not well structured extensive proof reading required 2a lot of citations are missing in sections 1 and 3 when explaining concepts even section 3 seems to be completely paraphrased from wu et als work but citations are missing 3as there is only 1 other night flare dataset comparisons are done with only that paper it is difficult to find reasonable contributions when compared with only 1 paper 4the authors claim that compared to the 
previous dataset flare7k has more annotations but the difference in size between the two datasets is not huge other datasets can easily overcome this size with proper data augmentation methods 5the authors have trained their dataset with only the network unet from the paper wu et al there are several other stateoftheart models that the authors could have experimented with in comparison tables to show the reliability of this dataset in short experiments are not extensive and clear 6in line 177 authors start with a new paragraph with the sentence to solve these issues there is no reference to the said issues mentioned in the previous sections it is not a good practice to skip referencing sections where there is clear verification of a claim or a fact 7in line 213 compared with previous datasets again missing references 8table 2 comparison is explained in supplementary material but it is not clear as to how those models parameters were selected for the evaluation and how flare7k was deployed on those models was flare7k trained or evaluated on all the models it looks like only fair comparison is with wus paper as they have used the same training model as flare7k but what about zhang and sharma is the flare7k evaluated on all the models and thats the result comparison in table 2 in short clear explanation answering these questions are needed for fair comparison 9conclusion is vague needs to be rewritten summarizing and highlighting specific contributions and results of the paper docsepthis work provides the first nighttime flare removal dataset flare7k before this work there was only a daytime flare dataset the lack of the nighttime flare dataset hinders research on the task of nighttime flare removal the dataset contains 7k flare patterns that can be used to synthetically generate flarecorrupted and flarefree pairs the authors compare the patterns of the existing daytime dataset by wu 7 and the proposed nighttime dataset flare7k in figure 3 flare7k seems to have more diverse patterns figure 5 shows the effectiveness of flare7k on the nighttime flare removal task only replacing the training dataset with flare7k improves results in my opinion from the qualitative comparison in figure 5 the difference between wus and the authors is not small however from the quantitative comparison in table 2 the difference between wus and the authors seems relatively small 1 the paper provides a detailed introduction which makes readers easily understand why lens flare occurs and why lens flare is a significant problem 2 provide the first nighttime flare dataset flare7k and show the effectiveness of flare7k 1 question are authors planning to provide the flarecorrupted and flarefree image pairs or provide the training code that generates paired images on the fly in my opinion the same paired dataset used in the experiments is necessary to reproduce the results and fairly compare methods i downloaded and checked the datasets the dataset only contains 7k flare patterns and test data not flarecorrupted and flarefree image pairs furthermore i cannot find the number of paired data it would be important to grasp how many paired images are required to remove nighttime flare in supplementary materials i find to train our nighttime flare removal model our paired flarecorrupted and flarefree images are generated on the fly and 60 epochs on flickr24k does it mean the number of paired data is 24k x 60 2 hard to understand section 42 dataset generation and figure 4 from l188l191 figure 4 presents our scattering flare 
synthesis pipeline from the reference flare images of each type we can obtain the relationship between the rgb value of the pixel and its distance to the light source such a relationship can be viewed as a color correction curve question 1 is the reference image in figure 4 a real nighttime flare image question 2 if so how do we obtain the steak mask 4 pattern parameters and round gradients from the reference image we can obtain the relationship seems not enough to explain in addition a brief explanation of optical flares in adobe after effects is necessary 3 the evaluation set only contains 20 synthetic and 20 real paired data it is small but at least it seems enough to compare flare removal methods as shown in table 2 increasing the size of the test set especially real data would be better rewriting l223 to l239 would be better it is hard to know that the test set consists of 20 real and 20 synthetic paired data at a glance 4 it would be better if there were more details about other methods zhang 40 and sharma 39 from table 2 their performances are worse than input i wonder why and how they work i find slightly more details in supplementary materials but i still cannot roughly understand how they work in addition in my opinion when using zhangs method authors should finetune their model on flair7k rather than using the pretrained model please ignore this comment if zhangs official code does not provide a proper training code 5 according to l107l108 qiao et al 8 tried to remove lens flare with unpaired data using the cyclegan framework if possible i would like to see the results when the cyclegan framework is trained on unpaired data generated by flare7k please ignore this comment if additional experiments are prohibited or somewhat burdensome docsepthe paper presents a dataset of synthetically generated flare images based on realworld flare images collected from different cameras with different lenses using the realworld reference as well as a strong background in the physics of flares the synthetically generated dataset had a much greater diversity than prior works thus allowing models to generalize better on unseen data the paper has shown that a model trained on their dataset has shown undeniable improvements than when trained on datasets from prior works the presented dataset has not only been shown to have already introduced significant boosts to model performance but has also opened up doors to other areas of research because of the accompaniment of annotation data although it is not as immediately clear why it would be relevant to industrial usage the paper has also done a very thorough literature review into the topic of image flare that will aid future studies the dataset is also diverse across geographical regions it isnt highlighted in the paper but the website for the dataset showcases images from the dataset that are taken from countries outside of singapore and in different environments the most immediate potential shortcoming i see to their method is that flare may have been synthetically generated in images where there would not otherwise be any flare even with any configuration of camera and lens for example there seem to be images in the dataset where flares were synthetically generated even though the original image does not have any lights or there were multiple lights in the image but only flares for one of them were generated just based on this disparity there is a clear distribution shift between their dataset and realworld data which might have held back the 
models abilities to generalize the paper could include details of methods to collect images that flares were synthetically generated for as well as details of how the adobe after effects tools worked under the hood ### Summary:
this paper proposes a new nighttime flare hazard dataset the synthetic dataset makes it possible to obtain ground truth which is otherwise very difficult in real scenes all reviewers are satisfied with the quality of the paper only one reviewer recommends marginally below acceptance and the concerns are minor those concerns have been carefully addressed by the authors through the discussion all other concerns have also been carefully addressed and the paper has been rewritten accordingly therefore this paper is suitable for acceptance
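the reviews above repeatedly refer to the core data-generation step: rendered flare patterns are additively composited onto flare-free night images on the fly to form (flare-corrupted, flare-free) training pairs. a minimal sketch of that compositing step is given below; the file names, the srgb-like gamma of 2.2, and the clipping choice are illustrative assumptions, not the dataset's actual generation code.

```python
# illustrative sketch of additive flare compositing in (approximately) linear light;
# all paths and constants below are assumptions for illustration only
import numpy as np
from PIL import Image

def load_rgb(path):
    """load an image as float32 rgb in [0, 1]"""
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0

def composite_flare(clean, flare, gamma=2.2):
    """add a flare layer to a clean image after undoing an assumed display gamma"""
    clean_lin = clean ** gamma
    flare_lin = flare ** gamma
    corrupted_lin = np.clip(clean_lin + flare_lin, 0.0, 1.0)  # simple additive model
    return corrupted_lin ** (1.0 / gamma)

clean = load_rgb("night_scene.png")      # hypothetical flare-free background image
flare = load_rgb("flare_pattern.png")    # hypothetical rendered flare layer
corrupted, target = composite_flare(clean, flare), clean  # (network input, ground truth)
```

a restoration network such as the u-net mentioned in the reviews would then be trained to map corrupted back to target, typically with full-reference metrics such as psnr or ssim on held-out pairs.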
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors propose to visualize training trajectories with the dimension reduction and visualization tool phate the authors try to reconstruct manifolds on which the training trajectories lie it argues that the method captures data characteristics from all dimensions and shows consistent geometric patterns for loss landscape surrounding good and bad generalization optima strength the approach embeds multiple training trajectories in one visualization by adopting dimension reduction approaches which is not explored before the comparison of multiple dimension reduction approaches is also lacking in previous literatures the paper has a nice review and summarization of previous approaches on loss surface visualization weakness my major concern is that the insights observed from the methods are not quite straightforward for interpretation and some conclusions are not clear or novel in page 4 the authors claimed we note that methods that pick random directionsare not able to visualize entire trajectories as the spaces and planes change this observation is also made in previous work1 and pca is used for visualizing the training trajectories however this is not necessarily a disadvantage for visualizing the flatness of the minima as discussed follows to characterize the region of loss landscape the authors propose to adopt jump and retrain setting 1 and this involves visualizing multiple trajectories together however the rationale to do this is not quite clear and the interpretation of the visualization is hard it is not quite clear about the advantage of proposed visualization figure 2 on explaining the flatness in comparison with previous 1d or 2d visualization with randomized directions on the other hand the computation cost multiple initialization multiple steps and retraining for generating those figures could be significant the proposed visualization should be valuable for comparing different trajectories produced by different optimizers as all trajectories can be plotted together however the visualization figure 4 does not convey a clear message about which optimizer trajectory is better in performance generalization the individual trajectory visualization figure 3 in sec 52 seems not to differ too much with the pca based visualization 12 what is the takeaway message from figure 3 it is not clear how the observation the resulting parameters appear to be further away from the initialization is made from the figure questions in section 53 the authors claim there are consistent and distinct patterns for good and bad minima is there a smooth transition between the good and bad will the pattern differ with a much better or worse initialization eg what is the pattern for random initialization which is the worst 1 li et al visualizing the loss landscape of neural nets nips 2018 2 eliana lorch visualizing deep network training trajectories with pca icml workshop on visualization for deep learning 2016docsep 1 brief summary the authors use a new dimensionality reduction technique called phate that was first introduced in a different paper to study the weightspace positions and training trajectories of several dnn architectures resnet wideresnet on vision classification tasks cifar10 cifar100 they use different optimizers sgd sgdmomentum adam study very good and decent they call them bad optima and the solutions to memorization of random labels they perform two kinds of experiments 1 
perturbing a single optimum and retraining from there visualizing those trajectories and 2 visualizing several random init optimum trajectories from different inits they then draw some conclusions from the phate projections for the two kind of experiments and different optimizers and kinds of optima 2 strengths i like the introduction of the phate algorithm and the authors explanation of its advantages and why it might be a good fit for the weight space trajectory visualization of dnns the paper is well motivated understanding the loss landscape of dnns is likely very important and still very underexplored i like the comparison to other algorithms pca tsne and the synthetic treelike structure they visualize in addition to real dnns to demonstrate the advantageous properties of phate i like that the others study the stability of their visualization to retraining and seed change that is very good and should be the norm in all papers that make claims that depend on stochasticity 3 weaker points 1 the scope of the paper is in my opinion a bit too limited a lot of space is used to introduce the phate algorithm that is as i understand it not a novel contribution of this work 2 while the visualizations look compelling the conclusions that are drawn from them are often too strong i dont see how the visualization can be connected to flatness dimensionality etc directly they do serve as a good visual guide but i feel the paper doesnt establish the connection to those quantities very well i see that lets say for the random labels and real labels the projections look different but how do i link this to eg the size of the lowloss basin around those two 3 the authors flip between a language suggesting that there are some specific dimensions that are being visualized eg potentially flying off to an orthogonal subspace and appreciating that the phate technique doesnt preserve dimensions in any meaningful way i would advise to soften the claims to reflect the nonlinear adaptive nature of phate and the difficulty of connecting the embeddings to any particular directions 4 the experiment type 1 going in random directions off an optimum and retraining back doesnt seem to push far enough what would happen if i perturbed more would the solutions glide back as they do how does this compare between the real labels and random labels id say this would be a nice validation at least if you go to far you should not go back if you do the phate projections do not project the way you believe if you dont thats interesting on its own and can be used for comparison between cases adam sgd random labels 5 the figure 3 results for a single trajectory look uninformative what would happen if you did this multiple times from different seeds are the adam paths smoother generically or just this particular random seed you should either make the experiment statistically meaningful in some way or explain what the reader is supposed to take away from it as is it doesnt do much 6 the experiments of type 2 random inits training optima look promising however they really break my intuition for what phate does and if it is at all relevant my main worry is this why do the inits end up coinciding from 1 and 4 it is quite certain that a the inits are mutually orthogonal to a high degree and b so are their optimized endpoints how come your inits get mapped to the same point sometimes but other times they do not this really puts your results into question for me especially because you use the visualizations to make pretty strong claims about the 
loss landscape structure 7 correct me if i am wrong only wideresnet and resnet on cifar10/100 were used in this paper those are two pretty similar architectures and two very similar vision tasks i would appreciate a much broader set of experiments to make sure the claims here hold for example a what do skip connections do both architectures here have them and b how would mnist fashion mnist svhn do those are cheap to train and easy to use datasets and i would expect the authors to use them c how would a simple feedforward cnn do d what about fullyconnected nets would they behave the same way emphatically i am not asking for imagenet i know that it is very hard to run and while it would be nice to have there is no need to use it in every paper but i would want to see a much broader set of experiments on other cnnbased architectures as well as other potentially weaker datasets as is the paper doesnt have the sufficient experimental support to see the generality of the interpretation claims 4 relevant papers that might be worth adding / exploring there are some papers that i think might be relevant here and that the authors might want to add / read / explore 1 large scale structure of neural network loss landscapes by stanislav fort stanislaw jastrzebski https://arxiv.org/abs/1906.04724 at neurips 2019 builds a geometric model of the lowloss manifolds of dnns incorporating the observations of 1 connectivity of init to optimum the highdim nature of the manifolds and their connectedness they also visualize the loss landscape on sections that might be relevant here they also show that they can find ndimensional surfaces connecting n+1 independent optima together going beyond the 2 optima on a 1d path 2 you might be missing the second paper that established the connectivity between different modes garipov t izmailov p podoprikhin d vetrov d p and wilson a g loss surfaces mode connectivity and fast ensembling of dnns at http://arxiv.org/abs/1802.10026 3 measuring the intrinsic dimension of objective landscapes by chunyuan li heerad farkhoor rosanne liu jason yosinski https://arxiv.org/abs/1804.08838 establishes that the loss landscape lowloss manifolds have a low intrinsic dimension that can be interpreted as high d of the manifolds done in https://arxiv.org/abs/1906.04724 4 deep ensembles a loss landscape perspective by stanislav fort huiyi hu balaji lakshminarayanan https://arxiv.org/abs/1912.02757 look at the cosine similarity of the weight space positions of multiple runs as well as a single run they also visualize the loss landscape between optima on different affine and nonaffine sections in your justification for why phate might be good at visualizing the trajectories you mention that they are low dimensional there is a number of works showing that it might or might not be the case 5 gradient descent happens in a tiny subspace by guy gurari daniel a roberts ethan dyer https://arxiv.org/abs/1812.04754 shows the gradients lie in a mostly lowd subspace but the actual learning happens orthogonal to that 6 emergent properties of the local geometry of neural loss landscapes by stanislav fort surya ganguli https://arxiv.org/abs/1910.05929 dissects the hessian structure and finds lowd signal + highd noise where the signal is not in the learning directions but rather forms constraints 7 traces of class/cross-class structure pervade deep learning spectra by v papyan arxiv 2008.11865 has an overview of the lowd structures in the dnn hessians that might be relevant 5 summary i like that you introduced a new dimensionality reduction technique to visualizing
loss landscape trajectories and positions for dnns the reasons why it might be better than others are compelling however i think that the scope of the experiments you provide is limited the claims you make a bit stronger than would be justified by the results shown and there are some worrisome features of some of the embedded trajectories that make my question the validity of your overall conclusions i think the paper has a promise but it needs a bit more work i am open to revising my score if you address my questions well docsepthe paper suggests using phate a modern dimensionality reduction method for visualizing the training trajectories of deep networks it argues that phate visualizations can bring to light interesting aspects of the training dynamics that are missed by other dimensionality reduction algorithms because phate does a better job at preserving both local and global structure in the data this is not the first application of phate in the context of deep learning but to my knowledge it is the first application to deep learning trajectories the paper shows that phate visualizations of trainandjump trajectories where the model is repeatedly pushed away from a minimum and then allowed to retrain can have features that are correlated with how well a model generalizes it also shows that different optimization algorithms can lead to distinguishable visual features while the use of better visualization techniques may lead to better understanding of neural network dynamics i am not convinced that the paper succeeds in making this case 1 the generalization results figure 2 are interpreted using known results about the generalization properties of flat vs sharp minima these results are nice and are potentially useful however they were only demonstrated in a few settings do these results hold more generally when using other architectures and different data sets 2 i felt that section 52 on visualizing the trajectories of different optimizers did not have a clear take away message 3 except for figure 1 phate is not compared against more commonly used reduction algorithms such as tsne and umap for example regarding figure 2 could we draw similar conclusions about generalization using other techniques docsep paper summary this paper uses phate to visualize the progression of neural net parameters during learning in neural networks to provide insight into generalizable vs nongeneralizable minima and the behaviors of different optimization algorithms phate is an improvement over previous visualization techniques due to its approach to finding a manifold allowing it to plot in two dimensions multiple trajectories that do not otherwise share a plane this is then used to plot trajectories in jump and retrain experiments in which a minimum is found and then perturbations made to the network parameters before restarting training it is shown that in the networks experimented on minima with good test set performance reliably funnel the new learning trajectories into the same minimum while minima with poor test set performance see the perturbed initializations find other minima thus demonstrating the flatness vs sharpness of minima trajectories produced by sgd sgd with momentum and adam are also compared adam is shown to travel further but along a smoother trajectory originality the visualization approach is not new but is new to neural network trajectories the visualizations effectively confirm suspected properties of neural networks but do not offer anything particularly new the behaviors of 
learning algorithms are demonstrated but not much discussed so little new is learned significance visualization is important to understanding the behaviors of highdimensional learning trajectories and this is better for plotting multiple trajectories than previous approaches it is an effective approach and is superior at demonstrating known neural network behaviors than others insights are not significant but the visualization of them is improved clarity very clearly written and easy to understand quality enjoyable paper which offers a new effective tool but no real new insights into neural networks the jumpandretrain experiments are effective and wellchosen the demonstrations of learning algorithm are not as effective and im not sure what to take away from them i invite the authors to help me find the significance of these experiments ### Summary:
the reviewers and i agree that the paper is well motivated and that there are good comparisons to prior work however the scope of the paper is rather limited and there were some doubts about the overall conclusions and whether the current results fully support them as such i cannot recommend the paper for publication
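The "perturb and retrain" protocol that the reviews above keep returning to (jump away from a trained optimum, retrain, and check whether the run glides back) is easy to restate in code. The sketch below is an editorial illustration only, not code from the reviewed paper; the noise scale `sigma`, the step budget, and the SGD settings are placeholder assumptions, and any PyTorch classifier and data loader could be substituted. Varying `sigma` is exactly the "what if you perturb more" probe one review asks for.

```python
# Hypothetical sketch of the jump-and-retrain experiment described in the reviews.
import copy
import torch
import torch.nn as nn

def flat_params(model):
    # One trajectory point: every parameter flattened into a single vector.
    return torch.cat([p.detach().reshape(-1) for p in model.parameters()])

def jump_and_retrain(trained_model, loader, sigma=0.1, steps=200, lr=0.05):
    """Perturb a trained model's weights with Gaussian noise of scale `sigma`,
    retrain with SGD, and record the flattened weights after every step."""
    model = copy.deepcopy(trained_model)
    with torch.no_grad():
        for p in model.parameters():
            p.add_(sigma * torch.randn_like(p))
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    trajectory = [flat_params(model)]
    batches = iter(loader)
    for _ in range(steps):
        try:
            x, y = next(batches)
        except StopIteration:
            batches = iter(loader)
            x, y = next(batches)
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
        trajectory.append(flat_params(model))
    return torch.stack(trajectory)  # shape: (steps + 1, n_params)
```

Repeating the call over several seeds and several values of `sigma` yields the family of trajectories whose 2-D projections the reviews discuss.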
[input_ids: token id sequence encoding the review text of this record; long integer list omitted]
[attention_mask: a list of 1s of the same length as input_ids; omitted]
[labels: token id sequence that begins identically to input_ids above; omitted]
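To turn weight snapshots such as those returned by the `jump_and_retrain` sketch earlier into a shared 2-D picture, all trajectories have to be embedded jointly rather than one at a time. A minimal sketch follows; the `phate` package and its scikit-learn-style `fit_transform` interface are an assumption on my part, and scikit-learn's t-SNE is included only as the baseline the reviews ask to compare against.

```python
# Hypothetical sketch: joint 2-D embedding of several weight-space trajectories.
import numpy as np

def embed_trajectories(trajectories, method="phate"):
    """trajectories: list of (n_steps_i, n_params) arrays sharing one weight space."""
    lengths = [len(t) for t in trajectories]
    stacked = np.concatenate([np.asarray(t) for t in trajectories], axis=0)
    if method == "phate":
        import phate  # assumed dependency: pip install phate
        emb = phate.PHATE(n_components=2).fit_transform(stacked)
    else:
        from sklearn.manifold import TSNE
        emb = TSNE(n_components=2, init="pca").fit_transform(stacked)
    # Split the shared embedding back into one 2-D path per trajectory.
    paths, start = [], 0
    for n in lengths:
        paths.append(emb[start:start + n])
        start += n
    return paths
```

Embedding everything in one call is what lets trajectories from different runs be compared on the same axes, which is the property the reviews credit to this family of methods.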
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper presents a selfsupervised video representation learning method videomae based on transformer design the idea of this approach is inspired by the imagemae 22 which is based on the masking and reconstruction strategy in videomae applying a high masking ratio and tube masking strategy leads to a challenging video reconstruction task which is the key to learn more representative features empirical results demonstrate the effectiveness of the approach on 4 different video datasets in different settings strengths the paper is wellwritten and easy to follow the motivation and the method are clearly elaborated the experiments support most of the claims in the paper the method outperforms existing approaches on challenging video benchmarks weaknesses 1 in my opinion the claim of the first contribution line 64 needs to be toned down as we have multiple works on the video ssl based on the idea of masking and predicting such as 53, 50 in which the results are almost on par with the proposed approach while there are some differences between the proposed approach and the existing ones which have also been mentioned by the authors in the submission the claim of the first masked video autoencoder that performs well for ssvp on smallscale video datasets is not entirely precise 2 while the proposed method presents an approach that performs onpar with sota the novelty of videomae remains limited in fact a the idea of masking and reconstruction for video ssl has been explored before 53, 50 b the masking strategy tube masking has been already explored in 53 c the idea of passing nonmasked tokens to the transformer encoder which results in an efficient encoding has been explored in the imagemae 22 paper 3 in table 5 of the main paper and table 13 of supp the maskfeat 53 results on ssv2 are shown with supervised labels while based on 53 the pretraining is done on k400 and k600 unlabeled data and then they have a finetuning step on ssv2 it would be good if the authors can clarify this to me 53 follows the standard ssl paradigm for these downstream tasks 4 while the feature transferability is investigated in table 4 as far as i understood the main set of experiments of comparison to the sota in table 5 for the ssv2 dataset is based on the pretraining step on ssv2 however in most of the previous works the common practice is to consider a single pretrained model which has been trained on a largescale dataset such as k400/k600 and then evaluate on downstream tasks such as ssv2 im curious to see the result of videomae on ssv2 when pretrained on k400 with the same design and training setting as in table 5 which makes the comparison of feature transferability comparable to sota yes docsepthis paper studies the masked autoencoder in the context of video selfsupervised learning it adapts the mae technique to the video domain and proposes tube masking along with a high masking ratio for videos the experiments are conducted on a set of standard video action recognition tasks showing the effectiveness of the approach in addition with this model the authors also find that videomae is data efficient when sufficiently trained the model achieves strong performance with a fraction of the training data the main concerns of this paper are novelty lack of comparison to a similar prior work and insufficient ablation studies overall if these concerns are properly addressed i am happy to see this work accepted strengths the proposed model is simple
yet quite effective as the core task is video classification it may benefit a lot of downstream video tasks as a foundation model this work shows that videomae is very data efficient this seems to be not studied in the image mae paper specifically the authors show that when sufficiently trained the model achieves strong performance with a fraction of the training data similarly they also show the model performs much better than alternative approaches on smallscale datasets such as hmdb51 weaknesses the proposed tube masking has been studied in vimpac 1 under the name block masking this undermines the novelty of the approach i suggest the authors cite, discuss and compare with this work the novelty is limited in the sense that the work seems to simply combine mae modeling techniques with the tube masking in vimpac with additional minor adjustments eg increasing the masking ratio of tube masking but please note i only put novelty as a minor concern as i do value the engineering effort of this paper and its potential benefits to the video understanding community the ablation study is conducted on a single dataset ssv2 however as shown in timesformer 2 the conclusion may vary a lot when looking at a different dataset eg kinetics400 k400 as ssv2 requires more temporal modeling while k400 videos are mostly stationary i would imagine many conclusions will change i strongly suggest the authors add additional ablation results on k400 i also commented in the question section about a specific ablation study where i believe the dataset property may affect the conclusion a lot 1 tan h lei j wolf t and bansal m 2021 vimpac video pretraining via masked token prediction and contrastive learning arxiv preprint arxiv 2106.11250 2 bertasius g wang h and torresani l 2021 july is spacetime attention all you need for video understanding in icml i dont see potential negative societal impact being discussed in this work docsepthe paper presents a selfsupervised learning method to learn representations of videos the method is based on being able to decode masked elements from an encoded video similar to the masked autoencoder approaches to image representation learning the authors perform extensive experiments on different video classification datasets and ablation studies over the different design choices as a result the model obtains stateoftheart performance and is able to perform well on smaller datasets without needing additional data which is not common for selfsupervised methods strengths comprehensive ablation studies that help understand which design choices work best very strong results on multiple datasets especially on smaller scale datasets without using additional data simple idea that should be reproducible with the information in the paper and the submitted code weaknesses the work is incremental and mostly consists in extending masked autoencoders to videos the small design choices required to adapt vit methods for images to videos such as temporal striding temporal size of the input tokens etc are well understood and the idea that the same spatial regions are correlated over time is well known in the video literature the analysis of the model result is shallow and limited to showing the performance of the model with a certain configuration without analyzing its mistakes detailed comment this paper extends the idea of masked autoencoders to video and runs multiple benchmarks to understand the effect of different design choices while the paper is relatively incremental there is a significant engineering effort
and the authors are able to obtain strong results on multiple video classification benchmarks one important insight is that this model is able to perform well on small datasets contrary to other selfsupervised learning methods however the analysis of the strengths and weaknesses of the model is quite limited overall it is a good submission with a simple idea and impactful experimental results therefore i argue for its acceptance the discussion on limitations and potential negative societal impacts of the submission is quite limited however there are no major ethical considerations with video representation learning in general the discussion could be extended by mentioning potential negative applications when applying the method to sensitive data ### Summary:
this paper studies the application of masked autoencoders to video data it is a very empirical paper with lots of ablations and experiments all three reviewers lean toward the acceptance of the paper reviewer c915 has a slight concern regarding the novelty of the paper over concurrent works including 50 and 53 the reviewers believe that the ablation study is exhaustive and the paper has good reproducibility the authors are encouraged to add new experiments with kinetics pretraining in the final version
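For readers unfamiliar with the two design choices the reviews debate, tube masking at a very high ratio and an encoder that only processes the visible tokens, a small sketch may help. It is an illustration written for this summary, not the authors' implementation; the token layout (temporal-major ordering over T x H x W patches) and the 90% ratio are assumptions.

```python
# Hypothetical sketch of tube masking and visible-token encoding.
import torch

def tube_mask(batch, T, H, W, mask_ratio=0.9, device="cpu"):
    """Pick the same spatial patches in every frame ("tubes") and return the
    indices of the visible (unmasked) tokens, assuming temporal-major ordering."""
    n_space = H * W
    n_keep = int(n_space * (1 - mask_ratio))
    # Random spatial subset, identical across all T temporal positions.
    keep_space = torch.rand(batch, n_space, device=device).argsort(dim=1)[:, :n_keep]
    offsets = torch.arange(T, device=device).view(1, T, 1) * n_space
    keep = keep_space.unsqueeze(1) + offsets          # (batch, T, n_keep)
    return keep.reshape(batch, T * n_keep)

def encode_visible(tokens, keep, encoder):
    """tokens: (B, N, D) patch embeddings; keep: (B, N_visible) visible indices.
    Only the visible tokens are passed to the (assumed transformer) encoder."""
    B, _, D = tokens.shape
    index = keep.unsqueeze(-1).expand(B, keep.shape[1], D)
    visible = torch.gather(tokens, 1, index)
    return encoder(visible)
```

With mask_ratio=0.9 the encoder sees roughly a tenth of the tokens, which is where the efficiency mentioned in the reviews comes from; the reconstruction loss would then be computed only on the masked positions.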
[input_ids: token id sequence encoding the prompt and review text of this record; long integer list omitted]
[attention_mask: a list of 1s of the same length as input_ids; omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: The paper extends the online meta-learning (OML) model with experience replay (ER) during meta-training. The paper also proposes a replacement policy for deciding which samples to replace in the reservoir buffer. Instead of storing the raw samples, and since the backbone model is static during meta-testing, it is better to store the feature representations, which is what the paper does.

1. Integrating experience replay (ER) during the meta-training phase is the key contribution. The paper mentions that using ER during meta-testing only, but not during meta-training, creates an inconsistency. What do you mean by inconsistency? During meta-testing and meta-training the classes are disjoint, so how is consistency defined?
2. The paper extends OML with ER, which does not seem a significant contribution. The buffer replacement policy is novel, and it may be useful for other replay-based approaches; PSS improves the model performance compared to the ring buffer, GSS, and bilevel selection.
3. In Algorithm 2, meta-training procedures are provided for the ER, while the meta-testing procedure is unclear. How are meta-tests performed? Would you please provide the algorithm for the same?
4. The direct comparison with OML is not fair, since OML does not leverage ER. The results seem trivial, since compared to training without ER, replay always improves the performance. Also, since during meta-testing only the classifier is trained, the observation that it is beneficial to store the feature representations is also trivial.
5. Another recent work [a] also improves the representation over OML for continual learning. I request the authors to please discuss the results with respect to [a]. [a] Knowledge consolidation based class incremental online learning with limited data, IJCAI 2021.

The paper has some novel parts, but the main contribution compared to OML is not significant.

docsep

The paper looks at the problem of catastrophic forgetting in continual learning. They propose a method that extends OML to use ER both during training and testing. In the ER buffer they store the embedding of the samples instead of the samples themselves. To avoid the interference caused by the batch nature of ER and the online nature of OML, they use meta-learning to select which samples, useful for avoiding catastrophic forgetting, should be kept in the ER buffer: instead of reservoir sampling, they predict the difference in classification loss caused by removing a sample from replay and use that score to decide what to remove.

Strengths:
1. The paper identifies some issues with one of the current methods for continual learning and proposes simple extensions to reduce the effect of those issues.
2. The empirical evaluations and visualizations show clear improvement in performance from adding the proposed ideas.
3. The paper is written very well; it is easy to read and understand.

Questions/weaknesses:
1. Could you elaborate a bit more on the inconsistencies and how exactly using ER only during testing affects learning performance? Is the inconsistency of concern here the inconsistency in inner-loop updates between meta-training and meta-testing?
2. Representation replay: (a) wouldn't not using ER for the RLN potentially lead to forgetting? (b) wouldn't representation replay still have the interference issue for the PLN? (c) is there a trade-off being made here? If so, any thoughts on what exactly it is and how to choose it?
3. Isn't the target for the prediction in PSS always shifting? If that is the case, what is the rationale behind not updating the FC layer during meta-testing training? Wouldn't the loss for the same sample change over the course of meta-testing training?
4. In ER, would having multiple different batches of data updates from the replay buffer plus the new sample alleviate the problem of losing the information from the new sample?
5. It is not clear if the contributions of the paper are significant enough for ICLR, or very incremental. Could you clarify the contributions of the paper in more detail, including the issues identified, the novel ideas contributed, where ideas were borrowed and applied in a different setting, the evaluations, etc.?
6. Are there methods other than ANML for the setup of interest in the paper, following OML (which was proposed in 2019)? While evaluating the proposed changes against OML and variations of OML is essential, it is also important to compare with other methods that look at the same setting, to situate the proposed method in the literature.

Other comments:
1. Related works section: note that apart from regularization- and rehearsal-based methods, there are also several architecture-based (modular-network type) and parameter-isolation methods that have been proposed for avoiding catastrophic forgetting in continual learning.

Simple extension to a current method; unclear about the significance of the contribution.
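The second review above describes the mechanism under discussion: a replay buffer that stores feature embeddings rather than raw samples and evicts the entry whose removal is predicted to increase the loss the least. The sketch below only illustrates that idea under stated assumptions: the class name, the scoring callable, and the eviction rule are hypothetical stand-ins, and in the paper the score comes from a meta-learned predictor rather than the placeholder used here.

```python
import random
from typing import Callable, List, Tuple

import numpy as np


class RepresentationReplayBuffer:
    """Toy replay buffer that stores embeddings (not raw inputs).

    Eviction is score-based: the entry whose removal is predicted to hurt
    the classifier the least is dropped first. The real method meta-learns
    this predictor; here `removal_score` is an arbitrary user-supplied
    callable, an assumption made purely for illustration.
    """

    def __init__(self, capacity: int,
                 removal_score: Callable[[np.ndarray, int], float]):
        self.capacity = capacity
        self.removal_score = removal_score
        self.items: List[Tuple[np.ndarray, int]] = []

    def add(self, embedding: np.ndarray, label: int) -> None:
        if len(self.items) < self.capacity:
            self.items.append((embedding, label))
            return
        # Score every stored entry plus the incoming one, then evict the
        # entry predicted to be least useful if removed.
        candidates = self.items + [(embedding, label)]
        scores = [self.removal_score(e, y) for e, y in candidates]
        evict = int(np.argmin(scores))
        del candidates[evict]
        self.items = candidates

    def sample(self, batch_size: int) -> List[Tuple[np.ndarray, int]]:
        return random.sample(self.items, min(batch_size, len(self.items)))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Placeholder score: the embedding norm stands in for the meta-learned
    # "loss increase if removed" predictor described by the reviewers.
    buf = RepresentationReplayBuffer(
        capacity=8, removal_score=lambda e, y: float(np.linalg.norm(e)))
    for step in range(20):
        buf.add(rng.normal(size=16), label=step % 5)
    print(len(buf.items), "embeddings retained")
```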
docsep

The paper studies the meta-continual learning problem. Representation replay, instead of raw-sample replay, is used to improve the performance of meta-continual learning, and a new strategy called predictive sample selection (PSS) is used to select samples into the replay buffer. Experiments demonstrate the effectiveness of the proposed methods.

Strengths:
1. The idea of using representations instead of samples for the buffer is interesting.
2. The predictive sample selection method is effective.

Weaknesses:
1. Lack of experiments or theoretical analysis to support using representations instead of samples.
2. The sample selection method lacks concrete analysis.

The topic is interesting; however, the contribution of this paper is minor. It is a combination of several incremental improvements.
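The reviews above repeatedly contrast the proposed selection policies (PSS and the predicted loss-difference score) with simpler buffer-filling baselines such as reservoir sampling and a ring buffer. For reference, the sketch below shows classic reservoir sampling, which keeps a uniform random subset of the stream and is the baseline behaviour the learned policies aim to improve upon; names and parameters are illustrative only.

```python
import random
from typing import Any, List


def reservoir_update(buffer: List[Any], capacity: int,
                     item: Any, n_seen: int) -> None:
    """Classic reservoir sampling update.

    After n_seen items have been processed, every item seen so far is
    retained with equal probability capacity / n_seen.
    """
    if len(buffer) < capacity:
        buffer.append(item)
    else:
        j = random.randint(0, n_seen - 1)  # inclusive bounds
        if j < capacity:
            buffer[j] = item


if __name__ == "__main__":
    random.seed(0)
    buf: List[int] = []
    for t, x in enumerate(range(1000), start=1):
        reservoir_update(buf, capacity=10, item=x, n_seen=t)
    print(buf)
```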
docsep

This paper proposes to integrate ER into the meta-training of online-aware meta-learning (OML; Javed and White, 2019). The authors propose to store the samples' representations, not the samples themselves, in a replay buffer. In addition, the authors propose a sample selection method, a meta-learned predictive sample selection, to select the most informative examples. Experimental results on several datasets demonstrate the effectiveness of the proposed method.

Strengths:
- The authors propose an improvement for OML by integrating ER into meta-training.
- The experiments show the effectiveness of the proposed method.

Weaknesses:
- The technical novelty of the proposed method is limited and incremental compared to online-aware meta-learning (OML). The proposed method is just a simple improvement based on the framework of OML; the difference is that it integrates ER into the meta-training of OML.
- The clarity of presentation could be improved. For example, the terms "meta-training training", "meta-training testing", and "meta-testing training" are not comfortable to read; it would be better to use common, simple words that are widely used, to improve the clarity of the presentation.
- For the proposed representation replay, there is a lack of concrete analysis of why integrating ER into meta-training could better alleviate catastrophic forgetting compared to OML. It is unclear to see the benefit and necessity of using ER during meta-training only from the derived gradients.
- The sample selection method lacks in-depth analysis of how different task samples are selected. What are the sample selection results for different seeds? Is the proposed sample selection method sensitive to data streams or task ordering?
- The experiments lack comparisons to some existing methods which consider memory selection, e.g., [1].

Reference:
[1] Online continual learning with maximal interfered retrieval. Rahaf Aljundi et al., NeurIPS 2019.

The technical contributions of this paper are limited compared to OML. Also, the presentation clarity, the detailed analysis of the proposed method, and the experiments need to be improved.
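A recurring question in these reviews is what "ER during meta-training" actually changes in the inner-loop update. The toy sketch below shows, under clearly stated assumptions, one place where replayed representations can enter the inner-loop classifier update: a frozen random projection stands in for the representation network (RLN) and a linear softmax head for the prediction network (PLN). The real method meta-learns the RLN with higher-order gradients and uses learned embeddings, so this is an illustration of where replay enters the update, not a reimplementation.

```python
import numpy as np

rng = np.random.default_rng(0)

DIM_IN, DIM_EMB, N_CLASSES = 32, 16, 5
RLN = rng.normal(scale=0.1, size=(DIM_EMB, DIM_IN))  # frozen stand-in for the encoder
W = np.zeros((N_CLASSES, DIM_EMB))                   # PLN: linear softmax head


def softmax(z: np.ndarray) -> np.ndarray:
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()


def inner_update(W: np.ndarray, batch, lr: float = 0.1) -> np.ndarray:
    """One inner-loop SGD step on a batch of (embedding, label) pairs."""
    grad = np.zeros_like(W)
    for emb, label in batch:
        p = softmax(W @ emb)
        p[label] -= 1.0              # d(cross-entropy)/d(logits)
        grad += np.outer(p, emb)
    return W - lr * grad / len(batch)


replay = []  # stores embeddings, not raw inputs (representation replay)
for step in range(200):
    x, y = rng.normal(size=DIM_IN), int(rng.integers(N_CLASSES))
    emb = RLN @ x                    # encode once; the raw sample can be discarded
    # The inner-loop update sees the new embedding together with replayed ones.
    batch = [(emb, y)] + replay[-4:]
    W = inner_update(W, batch)
    replay.append((emb, y))
print("inner-loop updates done; head norm:", float(np.linalg.norm(W)))
```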
### Summary:
Addressing the problem of catastrophic forgetting in continual learning, this paper extends OML to use experience replay (ER) during training, instead of the original approach which uses ER during the test phase only. The paper also proposes a policy for sample replacement in the reservoir. Experiments show the superiority of the approach on three standard benchmarks compared with several baselines. Reviewers were unanimously concerned that the technical contribution of the paper is not sufficient. The authors addressed several issues, including experiments to compare with additional baselines, but the technical novelty remains limited for an ICLR publication; the paper cannot be accepted in its current form.
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: This paper proposes to study exploration at different levels of granularity. Current methods either explore at the level of individual steps (e.g., epsilon-greedy) or at the level of experiments (e.g., first a reward-free exploration phase, followed by a task-dependent learning phase using the gathered data). This paper proposes to study exploration at the intra-episodic level, i.e., where the agent switches between exploration and exploitation within the same episode. They discuss various design choices for performing exploration at this level, for example switching after a certain number of steps, or with a certain probability, or switching based on the discrepancy between the predicted value and the actually experienced value. The experimental results show that including intra-episodic exploration gives a modest benefit over other exploration schemes when using an R2D2 base agent. Other insights are also included, which show that the proportion of exploration does change throughout the learning process, indicating that different degrees of exploration are useful at different stages. They also show that the informed switching component learns switching behaviors which are non-uniform throughout the episodes.

Strengths: This paper investigates a novel area which seems very important at a high level. It also proposes novel methods to start to address this area.

Weaknesses: The paper does not feel very focused; there are lots of different ideas and methods presented, but the takeaways are unclear. The figures are a bit hard to parse.

First, I applaud the authors for tackling a new and unexplored area. However, the takeaways and next research steps are unclear, and at least on Atari the benefits of intra-episodic exploration seem limited. Some suggestions for improving the paper:
- It would be helpful to have the different algorithm variants in Section 2 spelled out, especially regarding the different switching mechanisms. The current descriptions are quite high-level, and it would be helpful to make them concrete.
- The tuple notation shown in Figure 2 is a bit hard to parse. It would be helpful to show several examples so that the differences in the different elements of the list can be seen more clearly.
- I think the paper would be a lot stronger if there were some tasks which more convincingly demonstrated the benefits of intra-episode exploration. This could be a new task which the authors design themselves. I think including the Atari experiments is useful in that it shows their methods can improve performance on a standard benchmark; however, the monolithic exploration methods already work quite well on these tasks, and it is not clear if there is that much more improvement to be had by exploring at a finer granularity. I do agree with the authors that, in the big picture, monolithic exploration is likely suboptimal and more informed exploration will be necessary. I found their motivating example of learning to ride a bike while maintaining necessary daily activities to be very helpful. Can you design some task which distills this into a simpler format? For example, some setup where the agent must regularly find food from a predictable source (exploiting) but also must explore when it can between finding food. Introducing new tasks which measure the ability to optimally switch between explore and exploit modes would also be useful to the community in building on this work.

I am on the fence about accepting this paper. On one hand, I think it is good that the authors are exploring a new and important area, and the ideas are interesting. On the other hand, this work still feels preliminary, and the benefits of intra-episode exploration are not yet convincingly demonstrated. I am not strongly opposed to accepting this paper, since it could at least be a starting point for research in this area. However, if the authors could introduce new tasks where intra-episodic exploration convincingly helps, then I think this would be a very strong submission to a later conference.
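The review above summarises the switching mechanisms under study: blind switching after a fixed number of steps or with a fixed per-step probability, versus informed switching driven by a value-discrepancy signal. The fragment below is a minimal, hypothetical sketch of the two blind variants inside a single episode loop; the environment, the action sources, and the parameter values are placeholders, not the paper's implementation.

```python
import random


def run_episode_blind_switching(env_step, exploit_action, explore_action,
                                max_steps: int = 100,
                                switch_every: int = 20,
                                switch_prob: float = 0.0):
    """Toggle between exploit and explore modes within one episode.

    Either `switch_every` (step-count trigger) or `switch_prob`
    (per-step probabilistic trigger) drives the blind switch.
    """
    mode = "exploit"
    for t in range(max_steps):
        step_trigger = switch_every > 0 and t > 0 and t % switch_every == 0
        prob_trigger = random.random() < switch_prob
        if step_trigger or prob_trigger:
            mode = "explore" if mode == "exploit" else "exploit"
        action = exploit_action() if mode == "exploit" else explore_action()
        env_step(action)
    return mode


if __name__ == "__main__":
    random.seed(0)
    trace = []
    run_episode_blind_switching(env_step=trace.append,
                                exploit_action=lambda: "greedy",
                                explore_action=lambda: "random",
                                switch_every=20)
    print(trace[:25])
```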
docsep

This paper studies switching between exploit and explore modes in reinforcement learning. It discusses switching mechanisms based on time (blind switching) and based on state (informed switching). Studying seven Atari games, an empirical analysis of different switching mechanisms is performed. Overall, the paper is very strong, well-motivated, and empirically sound. Most of my concerns are with clarifying details of the experiments and improving the exposition; these are listed below.
- Paragraph 2 of the introduction: the typical answer to "when to explore" is when the agent is unfamiliar with the environment or its structure, i.e., early in its interactions with the environment. I would highlight here why the heuristic of exploring earlier should be challenged, instead.
- Paragraph 3 of the introduction: it is not clear what the connection to schizophrenia is, other than that this was a study of explore vs. exploit. Are these individuals somehow impaired in choosing between these options, etc.?
- Paragraph 1 of Section 2 (methods): not sure that the example of riding a bicycle works. Is the targeted acquisition of a new skill really exploration?
- Section 2.1, description of episode-level: it would be more useful to define "episode" in the context studied here, rather than using the example of training games vs. tournament matches.
- Caption to Figure 1: the differences for d-g are unclear just by looking at this figure and the caption, other than that they are different intra-episode approaches; it would be better to clarify this.
- Section 2.2, description of blind switching: this refers to fractional episode length, but if we don't get to choose the length of an episode, how do we implement fractional episode length ex ante?
- Section 2.3, description of bandit adaptation: the citations should be in-text citations.
- Section 3: it would be useful to explain the observed differences between these domains. Are there hypotheses for why certain domains are better suited for different switching mechanisms? The discussion in Section 3.3 starts to get at this but does not specifically address why the differences may arise.
- Section 3.2: in Appendix A.3 the compute budget is mentioned as being 2B frames; please clarify this difference.
- Section A.1, regarding using the full action sets: does this mean that some games have meaningless actions, i.e., actions that cause no effect in the world? How does this affect learning, and why was this choice made?
- Section A.1, regarding no life-loss signal: what exactly is an episode here if there is no life-loss signal? Is it just 108,000 frames, and why this number? How does scoring work?
- Section A.1: why was the choice to use raw, unprocessed frames made? In particular, what is the effect of keeping color information for learning?
- Section A.2: is 1 TPU used, as mentioned here, or 2 TPUs, as mentioned in Section A.3?
- Section A.2: citations should be in-text citations (should be after Schaul et al., 2021, in paragraph 2, line 4).
- Section A.3: "while 2 TPUs" should be "using 2 TPUs".
- The captions to Figures 9-15 (especially 9 and 10) should be expanded to clarify what the reader can learn from each figure, e.g., any hypotheses on what accounts for the differences between different environments.

The paper studies the relatively underexplored question of when agents should explore, introduces a novel exploration trigger called value promise discrepancy, and performs a thorough empirical analysis in the domain of seven Atari games. Please see the main review for detailed comments and suggestions for improvement.
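The reviews refer to an informed switching trigger, the value promise discrepancy: roughly, how far the value predicted k steps ago diverges from the rewards actually collected since then plus the current value estimate. The exact formulation should be taken from the paper itself; the sketch below is one plausible reading of that description, with placeholder value estimates and a hypothetical switching threshold.

```python
from typing import Sequence


def value_promise_discrepancy(v_past: float, rewards: Sequence[float],
                              v_now: float, gamma: float = 0.99) -> float:
    """|V(s_{t-k}) - (sum_i gamma^i * r_{t-k+i} + gamma^k * V(s_t))|.

    One plausible formalisation of the "predicted vs. experienced value"
    discrepancy described in the reviews; the paper's exact definition
    may differ.
    """
    k = len(rewards)
    realized = sum(gamma ** i * r for i, r in enumerate(rewards))
    return abs(v_past - (realized + gamma ** k * v_now))


if __name__ == "__main__":
    # Toy usage: switch to explore mode when the promise is badly broken.
    THRESHOLD = 0.5  # hypothetical switching threshold
    d = value_promise_discrepancy(v_past=2.0, rewards=[0.0, 0.0, 1.0], v_now=0.3)
    mode = "explore" if d > THRESHOLD else "exploit"
    print(f"discrepancy={d:.3f} -> mode={mode}")
```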
docsep

This paper investigates when to switch between exploitation and exploration, and how long to stay in each exploration mode, during RL learning. It proposes new ways to explore the subject, especially with intra-episodic exploration variants. It presents a large body of study results (10 pages of appendices) and concludes with very thought-provoking suggestions and discussions. This paper conducts a series of experiments aimed at answering the question of when we should switch between exploitation and exploration during RL learning. As positive points, we can cite the wide related bibliography, from analogies with the animal system (human included) of exploring, to other techniques such as the use of options. Another point to emphasize is the clear, careful, and pleasurable writing that the authors developed in the paper. The authors managed to transmit a moment of reflection and expansion of thoughts about the possible behaviors shown by agents who learn with RL; moments of reflection, in an area that transforms at a very high speed, are always welcome. Perhaps the negative point is the limitation of the contribution, since it is still a study that must be deepened in order to provide effective and efficient guidelines for RL system developers working in real applications. However, it is an in-depth study with interesting results. The article is worth disseminating, to remind everyone working in the RL area that there is still a lot to study, investigate, and evaluate in order to have robust, efficient, and effective systems. The paper brings an in-depth study with interesting results and is very well written; it is worth disseminating to remind everyone who works in the RL area that there is still a lot to study, investigate, and evaluate in order to have robust, efficient, and effective systems.
docsep

This paper proposes a mode-switching strategy for the exploration-exploitation dilemma, instead of monolithic behaviour policies, in order to obtain more diverse behaviour. Different granularities for the timing of the switches, as well as different switching mechanisms, are investigated (blind vs. informed switching). The focus for exploration is not on "how" but on "when"; for exploration they use both random network distillation (RND) and a uniform policy. The experiments are conducted on the Atari Learning Environment (ALE), where they provide performance and diversity results.

The idea of mode-switching is interesting; however, the presentation of the motivation as well as the results seem somewhat weak in my view. The main comparison baseline for the mode-switching architecture is the monolithic variant, where typically sparse rewards in hard-exploration tasks are augmented with intrinsic reward signals; this means that the modes that they switch between in this paper are merged homogeneously in time. A valid problem that the authors point out for the monolithic case is that the scale of the intrinsic reward signal has to be tuned and may need to change over time, but there is no comparison to these methods, and the superiority of their method over the monolithic variants is not highlighted well enough.

The authors try to circumvent the performance comparisons with other baselines by saying that they can obtain more diverse behaviours in terms of exploration strategies, and say that they do not aim to show improved performance. However, this diversity argumentation is not strong enough in my view. First, it is not clear to me whether the authors want to focus on the diversity of behaviour obtained within one variant of the mode-switching (e.g., informed switching), or whether they want to highlight the diversity across different granularities and switching mechanisms. If it is the latter, the diverse behaviour doesn't necessarily translate into performance for the different games, and the best performance for different games might be obtained with different strategies; but since this strategy is fixed prior to the experiment, it is not clear to me how this helps the diversity argument for the method itself. The authors point out that Montezuma's Revenge and Phoenix achieve their best performance with different mode-switching behaviours, but these results have a really high variance; perhaps 3 seeds are just not enough, and more seeds are needed to draw reliable conclusions.

I believe that the idea presented in this paper is interesting, but the results are lacking. The related work section briefly covers some similar methods, and I think comparisons are still needed with these other methods: e.g., Go-Explore also focuses on the "when" question of exploration and should be included as a baseline, as well as works with monolithic behaviour policies where the mode-switching is replaced by a weighting problem between the external and intrinsic rewards of a single behaviour policy.
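The last review contrasts the mode-switching agent with the monolithic baseline it argues should have been compared against: a single behaviour policy trained on extrinsic reward plus a scaled intrinsic bonus (e.g., from RND). The sketch below only illustrates the structural difference between the two schemes; the bonus scale, the action sources, and the switching rule are hypothetical placeholders rather than the paper's setup.

```python
def monolithic_reward(r_ext: float, r_int: float, beta: float = 0.5) -> float:
    """Monolithic scheme: one policy, rewards mixed homogeneously in time.

    beta is the intrinsic-reward scale the review notes must be tuned
    (and may need to change during training).
    """
    return r_ext + beta * r_int


def mode_switching_action(mode: str, exploit_action, explore_action):
    """Mode-switching scheme: two separate behaviours, only one active.

    The exploit behaviour follows extrinsic reward only; the explore
    behaviour follows an intrinsic signal (e.g., an RND bonus) or acts
    uniformly at random, depending on the variant.
    """
    return exploit_action() if mode == "exploit" else explore_action()


if __name__ == "__main__":
    print(monolithic_reward(r_ext=0.0, r_int=1.2))
    print(mode_switching_action("explore",
                                exploit_action=lambda: "argmax_Q",
                                explore_action=lambda: "uniform_random"))
```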
### Summary:
Exploration can happen at various levels of granularity and at different times during an episode, and this work performs a study of the problem of exploration: when to explore, when to switch between exploring and exploitation, at what timescale to do so, and what signals would be good triggers for a switch. The study is performed on Atari games.

Strengths:
- The study is well motivated and the manuscript is overall well written.
- It studies a new problem area and proposes an initial novel method for this problem.
- Extensive study on Atari problems.

Weaknesses:
- Some clarity issues, as pointed out by the reviewers.
- No illustrative task is given to provide a more intuitive exposition of the when-to-explore problem.
- Comparison to some extra baselines like Go-Explore would have been insightful.

Rebuttal: most clarity issues have been addressed satisfactorily, and it has been explained why some requests for extra baselines would be challenging or not relevant enough. While the authors agree that Go-Explore would be an interesting baseline, they seem to have not added it, and an illustrative task was not provided.

Summary: all reviewers agree that this manuscript opens up and tackles a novel direction in exploration and provides an extensive empirical study on Atari games, a standard benchmark for such problem settings. While I agree with the reviewers who point out that this paper could have been made stronger by adding an illustrative task and additional baselines like Go-Explore, there is a general consensus that the provided empirical study on this novel problem setting is a good contribution in itself. Because of this, I recommend accept.
input_ids: [token-id encoding of this row's review text; long numeric list omitted]
attention_mask: [all 1s, same length as input_ids; omitted]
labels: [same token-id list as input_ids; omitted]
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: summary: the paper investigates the intersection of federated learning and distant supervision of knowledge graphs from texts. the main innovation is a simple yet empirically effective denoising rule that selects only the sentences deemed to be the most reliable for learning in each round of training. strong points: the method is very simple to implement, therefore it might be widely adopted as a baseline; the denoising step, though introduced for federated learning and distant supervision, can be applied to many other domains as well and not just for relation extraction. weak points: the paper would benefit from further proofreading; some uncertainty about the experimental results, namely repeatability of the runs and the methodology of hyperparameter selection (more in the questions section). recommendation (pre-rebuttal): i like the simplistic approach, however in its current form i am leaning towards rejecting the paper due to my uncertainty about the empirical evaluation; i am willing to change my evaluation if these questions are resolved. post-rebuttal update: several of my concerns about the clarity of the experimental section were addressed, therefore i increased my score and am now inclined towards accepting the paper. questions: the methodology of obtaining the empirical results isnt clear from the text; section 4.1 mentions a single held-out test set, however then it isnt clear how the hyperparameter sweep discussed in sec 4.2 was done; if there is only a single test set, were the hyperparameters selected on the same test set that is later used to report the results in fig 2 and 3? i would consider that a flaw. or is there a second validation set, not mentioned in the text, explicitly used to tune hyperparameters? curves in fig 2 and 3 seem to be from just one run for each setting; doing more runs with different random initializations and reporting mean or top-k results with confidence intervals would increase confidence in these results. possible improvements: lazy mil almost does not leak the corpus information in each platform: can you make this statement more precise? baseline attention models are always forced to spread their attention among sentences in the local bag even if they are all irrelevant; what if the attention module is allowed to output an all sentences are irrelevant option? this is sometimes called a sentinel (https://arxiv.org/abs/1612.01887); this might work as learnable denoising that deals with the issue that some local bags dont include any relevant examples. add references for all datasets in section 4.1. alg 1: are lines 6 and 7 needed? they seem to be contained by line 8. alg 1 line 11: this piece of python in pseudocode might not be readable by everyone; i would explain it in plain words. alg 2 hyperparams: e is the number; what number? some language issues: leads to catastrophic repro ducibility -> is not reproducible; is a large biomedical with -> biomedical dataset with; kb is public available -> publicly available; the token representation is represented as -> the token is represented as; the overall of lazy mil is illustrated in algorithm 1 -> overall design of / overview of. to extract the highlevel sentence representation from three segments of cnn outputs and the boundaries of segments are determined by the positions of the two entities: i would first define what segments are and only after that how they are used in the nn; here the order is reversed and i find it more difficult to follow. konen: different accentuation in the last characterdocsepthis paper
addresses the relation extraction problem for distributed platforms with privacy concerns. the authors propose to leverage federated learning with denoising techniques for better performance. the relation extractor is set to the conventional piecewise cnn, and the main contributions focus on dealing with noise in the distributed setting. pros: a new method for relation extraction in federated learning to address privacy concerns; best performance compared to baseline models. cons: lack of novelty; this method follows the federated learning paradigm with few improvements, and the main contribution is limited to the proposed lazy multiple instance learning, which is not specific to the relation extraction problem; if it is not specific to relation extraction, more tasks are expected to demonstrate that it is a general algorithm. experiments are insufficient: first, it is doubtful whether the iid setting is a practical assumption for relation extraction, because in the real world the quality of the corpus on different platforms may vary drastically; second, given the iid setting, the authors can repeat their experiments to reduce randomness, because the proposed denoising strategy may only perform well in situations where useful sentences are allocated to only a few platforms; the authors may provide more analysis of the distribution of contributions from platforms. writing could be improved: in section 3.3, what does the value vi refer to? also, if vi > vj then the id_i-th sentence in platform i is selected; does this mean that at every round only one sentence from a platform is selected? what if, for a certain relation, there are many promising candidate sentences from one platformdocsepsummary this work focuses on investigating distant supervision for the relation extraction task within the federated learning paradigm. the proposed lazy multi-instance learning approach identifies reliable sentences for the same entity pairs across platforms to denoise distantly supervised data and perform relation extraction. the proposed approach has been applied on two relation extraction datasets and has reported gains. strengths: the core of distant supervision for relation extraction is to denoise sentences for building a supervised classifier; the problem is well investigated in the nlp community, however this work focuses on modeling the distant supervision problem in a real-world environment, addressing data decentralization, ownership and privacy, and i buy the idea of applying existing methods or extending nlp applications in a federated setup; the paper is well written, clearly motivated, and addresses a real-world problem; interesting work, a blend of distant supervision and federated learning for an nlp task. weaknesses: the paper lacks novelty in terms of methodology: distant supervision for relation extraction (pcnn and mil) and the federated learning methodologies used in this work have been inspired by existing works, and this work combines the two; the experimental setup needs more clarity. questions: 1 assumption made: sentences with the same entity pairs must scatter across platforms; how would the lazy mil system bootstrap without such an assumption (no same entity pairs)? 2 assumption made: the number of output classes is the same, therefore the theta parameter, including the output-layer weights connecting to the softmax, is used across platforms; in the real world this is not the case, as each platform or customer may have different relation types; how would you aggregate/distribute the output-layer parameters? 3 how to obtain a minimum viable global theta? how to init this without having the actual training data on the master
server? 4 in equation 2, vi is not controlled by any threshold; will it denoise if its value is too low, say 0.50, and it is still the highest among all the platforms? 5 in section 4.1, in data partitioning, how do you ensure the number of relation types remains the same across all platforms? additional comments: include an ablation study analyzing scores due to different values of k. experimental setup unclear: how to init theta in the global model? what is the held-out data set for local and global model training? results: the experimental results show noticeable gains, which is surprising as the nyt dataset is well investigated in distant supervision settings; as the training data is split across platforms, the overall system is decoupled into several local models, and in essence the overall performance in federated settings should deteriorate or remain competitive to the baseline models, due to no joint training on the overall corpora as well as due to somewhat lossy aggregation; please provide detailed reasoning about the noticeable gains achieved. reproducibility: no code availabledocsepthis paper explores relation extraction with distant supervision in the federated setting and focuses on handling the label noise problem from automatic distant supervision by multiple instance learning-based methods, and proposes lazy mil: for a specific triple (h, r, t) in the kb, probability values for relation r for sentences containing h and t are calculated locally on each platform, and the best value and its index on each platform are uploaded to the master server; the master server decides the most reliable sentence and broadcasts this information to all activated platforms in this round. strengths: this paper considers relation extraction in the federated setting, which is a new direction in this area; the mentioned label noise problem is a real problem in distant supervision, and it is intuitive that this problem exists in the federated setting; experiments show the effectiveness of the proposed method. weaknesses: the contribution of the paper is weak in the context of the expectations of iclr; for the baselines in the experiments, it is not clear how to use the baselines in the federated setting; for example, for keep the other modules unchanged and only replace the denoising module in the last sentence of sec 4.3, it is confusing to me how to replace this part: is it the part for broadcasting denoising information? except for the experiment results, it is not clear whether the proposed method is specific to the federated setting and what the federated-setting-specific designs are compared with other baselines. questions: in sec 4.4, we believe the reason is that our denoising method can hinder false-positive instances from poisoning local models; does this mean that other baselines cannot hinder false-positive instancesdocsepsummary this paper introduced a new federated scenario for distantly supervised relation extraction, where extractions come from multiple private resources and therefore a global model cannot directly access all the data simultaneously. to protect data privacy, this paper assigned each resource a local model for separate training; in order to apply the at-least-one sentence-bag denoising technique, at the beginning of each round local models calculate scores for sentences in the local bag, compare scores with other resources, and then generate the training data; after each round of parallel training, local models are synchronized with the global model via a weighted average algorithm; by only sharing scores rather than the full text, data privacy is
protected. experiments over two datasets show promising results. reasons for score: i vote for marginally negative. overall, this paper brought up an interesting challenge for this nlp subtask; however, the necessity of compromising accuracy in order to do parallel computing is not clear to me, and the model innovation is slightly weak. there are two comments from a performance-driven perspective: 1 is there a specific reason for this task to do parallel computing? from my understanding (if correct), if parallel training is not required, we can avoid the major information loss from the delayed denoising and averaging step for model integration; in other words, if we only train one local platform at a time, denoise frequently, and then synchronize the model parameters to the next platform for further training, the performance should be close or equal (as if using one model) to the performance upper bound when k=1; if parallel computing is the key, then it is necessary to analyze the speed and performance tradeoff. 2 it is necessary to use pretrained language models like mtb (the matching the blank paper) as the encoder, because the improvement over the current baseline could possibly be mitigated by a stronger encoder. ### Summary:
the paper studies distantly supervised relation extraction (dsre) in distributed settings. though dsre has been studied in the centralized setting, it has not been studied on distributed platforms. this paper leverages the federated learning setup for this problem and proposes to use lazy mil for this purpose. the paper identifies the main challenge as label noise but does not attempt to characterise the severity of the problem vis-a-vis the centralised setup; though intuitive, a formal approach would have helped in understanding the importance of the derived results better
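the reviews and the summary above repeatedly describe the same two-step round: each platform scores its local bag of sentences for a kb triple and shares only its best score and index, the master server selects the overall winner and broadcasts it, and after local training the platform weights are merged by a weighted average. the sketch below is a reconstruction of that round from the reviewers' descriptions, not the authors' released code; all names (local_best, select_reliable, fed_avg) and the toy data are assumptions.

```python
import numpy as np

def local_best(scores):
    # a platform scores the sentences in its local bag for one (h, r, t)
    # triple and shares only the best score and its index, never the text
    idx = int(np.argmax(scores))
    return float(scores[idx]), idx

def select_reliable(platform_best):
    # master server: pick the platform whose best sentence scores highest
    # and broadcast (platform id, sentence index) back to every platform
    winner = max(platform_best, key=lambda p: platform_best[p][0])
    return winner, platform_best[winner][1]

def fed_avg(local_weights, sizes):
    # synchronization step: average each parameter across platforms,
    # weighted by the number of training instances on each platform
    total = float(sum(sizes))
    return [
        sum(w[i] * (n / total) for w, n in zip(local_weights, sizes))
        for i in range(len(local_weights[0]))
    ]

# one toy denoising round for a single triple across three platforms
rng = np.random.default_rng(0)
platform_best = {p: local_best(rng.random(5)) for p in ("A", "B", "C")}
winner, sent_idx = select_reliable(platform_best)
print(f"platform {winner} holds the most reliable sentence (index {sent_idx})")

# toy synchronization of a single layer across two platforms of sizes 3 and 1
global_w = fed_avg([[np.ones(2)], [np.zeros(2)]], sizes=[3, 1])  # -> [array([0.75, 0.75])]
```

note that only scores, indices, and model parameters cross platform boundaries in this sketch, which is precisely the privacy claim the first review asks the authors to state more precisely.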
input_ids: [token-id encoding of this row's review text; long numeric list omitted]
attention_mask: [all 1s, same length as input_ids; omitted]
labels: [same token-id list as input_ids; truncated in the source and omitted]
2182, 7208, 285, 840, 11842, 907, 1566, 3602, 281, 253, 1735, 5147, 323, 2007, 3733, 253, 3045, 943, 320, 2810, 390, 4503, 604, 970, 581, 1566, 281, 253, 3045, 5170, 3033, 672, 465, 18, 50276, 338, 7529, 12672, 310, 253, 2234, 840, 352, 310, 3309, 281, 12106, 253, 3885, 285, 3045, 5454, 2727, 50276, 19, 352, 310, 3309, 281, 897, 3215, 11273, 3448, 3210, 751, 278, 25192, 11038, 253, 9912, 2929, 347, 253, 32049, 984, 7756, 689, 253, 1655, 8245, 812, 6830, 320, 4784, 27285, 407, 247, 10046, 32049, 50276, 187, 187, 4118, 18435, 27, 783, 2929, 2175, 940, 5954, 22296, 5886, 11998, 1397, 250, 275, 5939, 7533, 2167, 20505, 250, 556, 644, 5421, 275, 36409, 4758, 352, 556, 417, 644, 5421, 275, 5939, 5147, 436, 50276, 20790, 19732, 1131, 253, 10208, 12072, 4715, 9978, 323, 436, 1895, 285, 29328, 281, 897, 22658, 2301, 323, 436, 4096, 50276, 783, 2929, 50276, 888, 7790, 253, 2022, 5691, 347, 5203, 6046, 533, 1057, 417, 3177, 281, 1894, 885, 253, 12147, 273, 253, 1895, 1649, 41826, 253, 4275, 1701, 9978, 50276, 2004, 27350, 533, 247, 7473, 2746, 651, 452, 6518, 275, 4685, 253, 6349, 273, 253, 6012, 1543, 1805, 50276 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: The authors propose an autoregressive framework, MetaGLAR, comprising a global RNN model and a local linear model learned using a closed-form solution, for the task of meta-forecasting of time series (TS). The linear model is the task-specific model and varies for each TS in the dataset, whereas the RNN model is the meta model that is shared across all time series. The authors perform comparisons with local models as well as N-BEATS and DeepAR, and show that MetaGLAR is competitive with some of them on particular metrics. They also perform an ablation study to show that each component in their model is important to the overall zero-shot transfer in TS forecasting.

Strengths: The authors describe the problem setting for meta-forecasting as well as the MetaGLAR approach quite clearly. The experiments conducted on MetaGLAR are quite comprehensive (large hyperparameter sweeps, comparisons with models directly trained on target TS). Furthermore, the experimental setup is described in sufficient detail, along with supplementary code, to ensure reproducibility. The ablation study conducted to determine the importance of each component in the MetaGLAR model is comprehensive and shows that each component in the model (meta-learned RNN, closed-form adaptation, and iterated forecasts) adds to the performance, although it does not clarify why the RNN with iterated forecasts leads to worse performance than an RNN baseline.

Weaknesses: The novelty of the approach itself is quite limited; the local model (the differentiable local adaptation layer) is essentially a linear layer applied on top of the RNN in the context window. The MetaGLAR approach could be approximately described as training an RNN on the full time series without training the linear layer in the forecast horizon, and then using the linear weights from the context window only during transfer to new tasks. In Table 1, why did the authors choose to report three different metrics across the four datasets (ND on electr and traff, sMAPE on M3, MAPE on M4) in their comparisons? If all three metrics were calculated across all baselines and the authors' proposed models, then these results need to be included at least in the appendix. Since the authors have not reported their reasoning behind choosing different metrics for different datasets, it raises the suspicion that these comparisons might be cherry-picked. In their comparison with fine-tuning, did the authors try fine-tuning both the final layer and the RNN? That seems like a fairer comparison versus keeping the RNN model fixed and only training the final layer. The authors used closed-form solutions for the final layer, which might get more expensive to compute if the RNN model were not fixed; a comparison with a global linear layer fine-tuned alongside the RNN model would provide a better picture of the importance of their local adaptive layer. One broader question that this research raises is about the scope and relevance of the meta-learning framework to TS forecasting. The framework lends itself very well to domains like computer vision due to the relationship between visual tasks [1]; however, it is not immediately obvious why it would be relevant to time series forecasting, where the underlying domains and granularities can vary widely. I'd like to see the authors comment on the scope and limits of this work in their claim that MetaGLAR "is trained on a source dataset and can then generate accurate forecasts for new time series that may differ significantly from the source dataset".

Errata/clarifications: Did the authors mean to say "generalization" in the intro, para. 2 ("global-local approaches exhibit a greater level of specialization as they learn parameters that are shared by all TS in the training set")? The reference regarding the use of closed-form solvers in spatial regression seems incorrect in the related work (end of para. 2: "and spatial regression (Iwata and Kumagai, 2020), while we are not aware of any application in forecasting").

References: [1] Amir R. Zamir, Alexander Sax, William Shen, Leonidas J. Guibas, Jitendra Malik, Silvio Savarese. Taskonomy: Disentangling Task Transfer Learning. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2018.

The problem of zero-shot transfer in time series forecasting is important, but it is also necessary to define its scope. The method presented by the authors is novel but incremental. There are concerns regarding the reporting of results (see weaknesses) which cast doubt on the model performance, which is already lower than N-BEATS on some tasks. Overall, the paper presents an interesting approach to performing meta-forecasting, but it needs more detailed analysis in terms of its applicability as well as completeness in terms of reporting its results. I would be willing to consider an improvement in the score if these issues are addressed. Post-rebuttal: my concerns regarding scope and cherry-picking have been sufficiently addressed in the revised submission and the authors' comments; however, I am still concerned about the novelty of the work as well as the extent of ablations. I have increased my overall score to 5 (marginally below the acceptance threshold) and provided additional feedback to the authors in comments.

The authors propose a new meta-learning framework to tackle the zero-shot learning problem for time series data through the combination of (1) a standard autoregressive architecture encoding historical information (e.g. DeepAR) and (2) an adaptive linear output layer whose weights are calibrated in closed form using encoder history. This allows the model to be trained end-to-end on a source task while automatically performing domain adaptation when applied to a new target task.

Strengths: (1) The automatic output-layer tuning they propose is an interesting idea; although not explored in this paper, this could potentially be adopted to cope with concept drifts or structural breaks, which are prevalent in non-stationary time series datasets.

Weaknesses: However, I do have some concerns with the paper in its current form, which has a couple of areas that need to be addressed. (1) The authors use a couple of technical terms in non-standard ways, making the introduction slightly confusing. Out-of-sample: in the time series forecasting domain, out-of-sample data typically refers to data forwards in time that is unavailable during training; generalisation forwards in time is the primary purpose of training forecasting models, whether statistical or machine learning approaches. The authors use this to refer to generalisation performance on a different time series entirely, however, which depends critically on similarities between source and target tasks; for instance, it is not unreasonable for a forecasting model that predicts patient survival times to fail to predict retail sales. Multitask learning: global forecasting models are typically not referred to as multitask learning, as the goal is to achieve good forecasting performance across all entities and time steps as measured by a single evaluation metric (see [1] for more on multitask learning). (2) Suitability of benchmarks: while local statistical models can be useful for data-limited settings, I am not sure these are suitable comparisons for zero-shot models, which cannot access the target dataset before prediction time. Assuming some data is available for training, there are few global network options available for small-data regimes (e.g. [2]), and comparisons to other standard transfer learning/domain adaptation techniques should be evaluated. Similarly, any differences vs. N-BEATS could have arisen from the choice of encoder; an N-BEATS-style encoder should be used within the proposed meta-learning framework for a fair comparison. (3) How does the framework handle transfer learning between datasets with a different number of covariates? The utility of the approach would be very limited if only univariate target inputs can be used, as a lot of useful information would also need to be discarded. (4) Complexity of matrix inversion: assuming that h_t has dimension d, inverting the d x d matrix typically is O(d^3); as it is possible for d > t (see optimal hidden-state sizes in DeepAR and the Temporal Fusion Transformer vs. encoder length), this can be greater than the sequential matrix multiplications for the LSTM, O(d^2 t), I believe. As such, computational complexity may indeed be a concern in some scenarios for this layer.

References: [1] Crawshaw. Multi-Task Learning with Deep Neural Networks. arXiv:2009.09796. [2] Rangapuram et al. Deep State Space Models for Time Series Forecasting. NeurIPS 2018.

While the raw idea has promise, many corrections/clarifications need to be made before it is ready for publication, specifically related to the terminology used, whether inputs between datasets can differ, and the complexity claims of their proposed layer. In addition, more suitable benchmarks are required to fully evaluate performance claims.

This work proposes a new forecasting method for jointly learning from a large pool of related time series. The method, called Meta Global-Local AutoRegression (MetaGLAR), learns the mapping from representations produced by an RNN to one-step-ahead forecasts, where the parameters are learned across multiple time series through backpropagation. This work studies the zero-shot transfer learning problem and proposes a method for it; the method is somewhat incremental and the evaluation has some issues (see below for details). This work develops a new method for out-of-sample time series forecasting, that is, the transfer setting. Instead of calling the method a meta-forecasting method, which seems problematic, it would be better to use the more standard and frequently used term of zero-shot transfer learning, which fits much better with the method proposed. MetaGLAR is shown to have similar runtime and memory overhead compared to a global one-step-ahead RNN with similar structure; however, it is unclear how it compares with the other baselines shown in Table 1. In fact, the only result showing training time I noticed is in Figure 2 (left); however, this only compares MetaGLAR to the RNN. What about the other methods, especially N-BEATS? I was hoping to see a table similar to Table 1 but with training time for all the baseline methods across all the datasets. That said, I would remove the last contribution mentioned, since it is only in terms of the RNN and not the actual state-of-the-art methods. Deep Factors and the more recent Graph Deep Factors were both shown to outperform DeepAR, N-BEATS, and the other baselines used; how do these methods compare to MetaGLAR? It seems they would be trivial to use in this setting as well. Nevertheless, they should also be included as baselines and discussed appropriately, since both utilize global and local components and can be trained with a large corpus of time series. The captions of Table 3 and Table 4 have a typo ("Table 3: sMAPE on M3, all all models"). The reproducibility details mentioned in the appendix are good and easy to follow. The method proposed is interesting but somewhat incremental, and the problem is important; however, there are several issues that need to be addressed. The method typically doesn't perform well compared to the state-of-the-art, as shown in Table 1. It is also strange why different metrics are used for different datasets: of the four datasets used, there are 3 different metrics used in the evaluation (ND for electr/traff, sMAPE for M3, and MAPE for tourism). It would have been better to report all 3 metrics for each dataset, or just show one of the metrics and the others in the appendix; it currently seems to be cherry-picked a bit. Furthermore, the training time for each baseline and dataset is missing; please include a table like Table 1 but with the runtime for each baseline and dataset. Please see above for other points. Overall, the contribution and novelty of this work is limited.
### Summary: This paper proposes an autoregressive framework that combines an RNN and a local linear component for the problem of meta-forecasting of time series. The linear model can domain-adapt to different time series, while the RNN component is shared across series. Reviewers thought the problem was important, the paper was generally clear, and the experiments extensive. However, they found the significance to be limited, and all took issue with some of the ways that the comparisons were done. fzbr also raised the issue of the complexity of the matrix-inversion component of the method. I believe this paper does fall on the rejection side of the fence due to the issues of complexity, significance, and evaluations; with some development the paper could certainly be ready for acceptance.
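To make the closed-form local adaptation and the matrix-inversion cost discussed above concrete, a minimal sketch is given below. It assumes a ridge-regularized least-squares readout fitted on top of frozen encoder states; the function name, toy data, and regularizer are illustrative and may differ from the authors' exact formulation.

```python
import numpy as np

def fit_local_readout(H, y, lam=1e-2):
    """Closed-form (ridge) fit of a local linear readout on encoder features.

    H   : (T, d) array of RNN hidden states over the context window
    y   : (T,)  array of one-step-ahead targets aligned with H
    lam : ridge regularizer keeping the d x d system well conditioned

    Forming H^T H costs O(T d^2) and solving the d x d system costs
    O(d^3), which is the cost the complexity discussion refers to.
    """
    d = H.shape[1]
    A = H.T @ H + lam * np.eye(d)   # (d, d) normal-equations matrix
    b = H.T @ y                     # (d,)
    return np.linalg.solve(A, b)

# Toy usage on a new (target) series: run the frozen encoder over the
# context window to get H, fit w locally, then forecast with H @ w.
rng = np.random.default_rng(0)
H = rng.normal(size=(48, 16))       # hypothetical context-window features
y = rng.normal(size=48)             # hypothetical one-step-ahead targets
w = fit_local_readout(H, y)
forecast = H[-1] @ w                # one-step-ahead prediction
```

In a sketch like this, transfer to a new series only reruns the cheap local fit while the encoder stays fixed, which is why the reviews focus on the cost of the d x d inversion rather than on retraining the RNN.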
[ 275, 253, 3634, 3497, 253, 1313, 356, 9388, 2746, 812, 320, 5512, 2529, 347, 3733, 271, 391, 9866, 327, 253, 2120, 673, 2962, 1293, 3733, 253, 4872, 3828, 275, 253, 16923, 16892, 285, 840, 970, 253, 4872, 13461, 432, 253, 3634, 3497, 760, 1309, 3700, 281, 747, 8892, 50274, 249, 2829, 337, 2139, 858, 253, 4477, 9703, 281, 1304, 1264, 1027, 17082, 2439, 253, 1740, 15302, 40515, 327, 1516, 83, 285, 19225, 256, 785, 365, 327, 278, 20, 278, 2259, 327, 278, 21, 275, 616, 14023, 604, 512, 1264, 17082, 497, 5118, 2439, 512, 1666, 25379, 285, 253, 4477, 4081, 3210, 840, 841, 1543, 878, 281, 320, 2908, 387, 1878, 275, 253, 30762, 1580, 253, 4477, 452, 417, 2361, 616, 14720, 3212, 448, 5555, 1027, 17082, 323, 1027, 15302, 352, 16540, 253, 18910, 326, 841, 14023, 1537, 320, 33804, 5055, 50275, 249, 616, 5301, 342, 1442, 292, 25004, 858, 253, 4477, 1611, 1442, 292, 25004, 1097, 253, 2457, 3828, 285, 391, 9866, 326, 3133, 751, 247, 22870, 83, 5301, 7147, 7562, 253, 391, 9866, 1566, 4229, 285, 760, 3733, 253, 2457, 3828, 253, 4477, 908, 4581, 830, 5482, 323, 253, 2457, 3828, 534, 1537, 755, 625, 8214, 281, 11897, 604, 253, 391, 9866, 1566, 369, 2649, 4229, 247, 5301, 342, 247, 4156, 4872, 3828, 1442, 292, 37437, 12936, 253, 391, 9866, 1566, 651, 2085, 247, 1805, 5406, 273, 253, 6349, 273, 616, 1980, 17825, 3828, 50275, 531, 16055, 1953, 326, 436, 2561, 16540, 310, 670, 253, 7990, 285, 17200, 273, 253, 11419, 4715, 7792, 281, 28669, 16923, 272, 253, 7792, 298, 1727, 3139, 1077, 973, 281, 10625, 751, 4382, 8113, 1955, 281, 2954, 875, 5304, 8892, 337, 2299, 352, 310, 417, 4745, 4755, 2139, 352, 651, 320, 4623, 281, 673, 2962, 16923, 272, 835, 253, 6944, 10625, 285, 32449, 1005, 476, 6889, 7561, 2654, 751, 281, 923, 253, 4477, 4385, 327, 253, 7990, 285, 7787, 273, 436, 789, 275, 616, 3916, 50276, 3899, 356, 9388, 310, 10166, 327, 247, 2603, 10895, 285, 476, 840, 6635, 7899, 43942, 323, 747, 673, 2962, 326, 778, 9184, 3012, 432, 253, 2603, 10895, 50274, 1000, 255, 317, 9388, 6787, 50274, 14958, 253, 4477, 1599, 281, 1333, 26647, 275, 253, 715, 5586, 374, 50275, 28626, 455, 3100, 7274, 50276, 911, 26849, 247, 3687, 1268, 273, 48544, 347, 597, 3037, 3602, 326, 403, 6096, 407, 512, 28669, 275, 253, 3733, 873, 50274, 783, 3806, 5001, 253, 897, 273, 4581, 830, 1220, 735, 275, 8820, 9077, 3133, 13583, 275, 253, 2905, 789, 990, 273, 5586, 374, 50276, 395, 8820, 9077, 891, 88, 682, 285, 465, 360, 356, 2284, 9169, 1223, 359, 403, 417, 6600, 273, 667, 2898, 275, 16923, 272, 50275, 250, 3065, 50276, 18, 4836, 13646, 557, 290, 36874, 4836, 3700, 4715, 717, 343, 391, 1182, 312, 343, 247, 1591, 5945, 44392, 588, 16726, 703, 79, 458, 251, 21400, 480, 1149, 487, 284, 480, 262, 49955, 4691, 1479, 2830, 87, 900, 5745, 609, 339, 10061, 273, 253, 26332, 1796, 8059, 327, 4382, 8113, 285, 3102, 8981, 30105, 1087, 4765, 253, 1895, 273, 1182, 254, 6934, 302, 3700, 275, 673, 2962, 16923, 272, 310, 1774, 533, 352, 310, 671, 3309, 281, 4853, 697, 7990, 253, 1332, 3559, 407, 253, 4477, 310, 4460, 533, 32809, 627, 403, 7350, 5001, 253, 9610, 273, 1543, 923, 32213, 534, 5248, 5545, 327, 253, 1566, 3045, 534, 310, 2168, 2406, 685, 295, 1257, 1832, 327, 690, 8892, 4583, 253, 2929, 10262, 271, 4722, 2746, 281, 9591, 11419, 922, 29851, 533, 352, 3198, 625, 7000, 1783, 275, 2426, 273, 697, 30437, 347, 973, 347, 29867, 275, 2426, 273, 9610, 697, 1543, 891, 651, 320, 7378, 281, 1908, 271, 7756, 275, 253, 4868, 604, 841, 3374, 403, 9713, 50274, 5996, 250, 2858, 22559, 619, 7350, 5001, 7990, 285, 33804, 81, 12427, 452, 644, 10481, 9713, 275, 
253, 17265, 19529, 285, 253, 4477, 5701, 2299, 891, 717, 1335, 7514, 670, 253, 38135, 273, 253, 789, 347, 973, 347, 253, 6070, 273, 490, 77, 569, 891, 452, 2559, 619, 4583, 4868, 281, 608, 42876, 2708, 253, 14924, 7887, 285, 2530, 3081, 8680, 281, 253, 4477, 275, 5701, 50276, 7152, 339, 431, 248, 4477, 12661, 247, 747, 5148, 613, 920, 7792, 281, 18915, 253, 1182, 254, 6934, 302, 4715, 1895, 323, 673, 2962, 941, 50276, 10489, 253, 5019, 273, 337, 186, 66, 2629, 47694, 11020, 10336, 9706, 9493, 1491, 24088, 3676, 274, 374, 186, 266, 17825, 4872, 3453, 3828, 3692, 13461, 403, 35890, 275, 4581, 630, 970, 32049, 2892, 50276, 2520, 4483, 253, 1566, 281, 320, 10166, 990, 936, 423, 327, 247, 2603, 4836, 1223, 8356, 9591, 5028, 15644, 672, 3732, 281, 247, 747, 2303, 4836, 50276, 296, 3755, 20556, 50276, 18, 186, 783, 12077, 3453, 3828, 25184, 597, 12661, 310, 271, 4722, 2934, 3738, 417, 14859, 275, 436, 2929, 436, 812, 7826, 320, 8671, 281, 23808, 342, 4473, 1837, 8515, 390, 8350, 13471, 534, 403, 21270, 275, 1327, 20502, 552, 673, 2962, 15302, 50276, 20881, 1255, 265, 50276, 35529, 891, 513, 452, 690, 7350, 342, 253, 2929, 275, 697, 1655, 830, 534, 556, 247, 4564, 273, 3672, 326, 878, 281, 320, 9713, 50276, 18, 186, 783, 4477, 897, 247, 4564, 273, 7681, 2426, 275, 1327, 15291, 4088, 2403, 253, 10199, 5777, 21643, 50274, 483, 1171, 16848, 50276, 249, 253, 673, 2962, 16923, 272, 5028, 562, 1171, 16848, 941, 5431, 10770, 281, 941, 32856, 275, 673, 326, 310, 29356, 1309, 3733, 2087, 5837, 32856, 275, 673, 310, 253, 3625, 4096, 273, 3733, 16923, 272, 3210, 1880, 7605, 390, 5145, 4715, 7274, 253, 4477, 897, 436, 281, 3730, 281, 2087, 5837, 3045, 327, 247, 1027, 673, 2962, 7094, 2299, 50276, 4609, 7024, 21038, 327, 22620, 875, 2603, 285, 2303, 8892, 323, 4227, 352, 310, 417, 20697, 323, 247, 16923, 272, 1566, 326, 26295, 3110, 5788, 2069, 281, 1891, 281, 3283, 10567, 6224, 50274, 9961, 262, 1945, 4715, 50276, 14456, 16923, 272, 3210, 403, 5431, 417, 6289, 281, 347, 1554, 262, 1945, 4715, 347, 253, 4736, 310, 281, 5115, 1175, 16923, 272, 3045, 2439, 512, 14429, 285, 673, 5018, 347, 4080, 407, 247, 2014, 7103, 7982, 923, 337, 323, 625, 327, 1554, 262, 1945, 4715, 374, 186, 46522, 1430, 273, 49602, 50276, 6050, 1980, 7605, 3210, 476, 320, 4217, 323, 2856, 267, 303, 959, 7533, 891, 717, 417, 2119, 841, 403, 7470, 14023, 323, 1182, 254, 6934, 302, 3210, 534, 2550, 2289, 253, 2303, 10895, 1078, 10554, 673, 7384, 690, 941, 310, 2130, 323, 3733, 627, 403, 1643, 4156, 2990, 4610, 2130, 323, 1355, 941, 27005, 24088, 374, 285, 14023, 281, 643, 2629, 3700, 4715, 13517, 15644, 5609, 943, 320, 6760, 12014, 667, 3910, 4632, 295, 1257, 1832, 812, 452, 45764, 432, 253, 4327, 273, 32049, 50276, 66, 295, 1257, 1832, 3740, 32049, 943, 320, 908, 1561, 253, 4081, 5148, 613, 920, 7792, 323, 247, 4344, 5301, 495, 186, 5430, 1057, 253, 7792, 6016, 3700, 4715, 875, 15302, 342, 247, 1027, 1180, 273, 33520, 253, 11839, 273, 253, 2746, 651, 1077, 3710, 604, 760, 36474, 2303, 14800, 476, 320, 908, 347, 247, 2257, 273, 4217, 1491, 651, 671, 878, 281, 320, 25665, 577, 186, 19017, 414, 273, 4315, 27697, 50276, 37411, 326, 288, 85, 556, 10103, 277, 275, 31324, 253, 277, 17176, 4315, 5431, 310, 7687, 20, 347, 352, 310, 1896, 323, 277, 50276, 85, 923, 8654, 8763, 1375, 9552, 275, 3676, 274, 285, 11935, 11781, 39707, 4632, 32049, 2978, 436, 476, 320, 3687, 685, 253, 22453, 4315, 30840, 569, 323, 253, 298, 296, 78, 7687, 19, 246, 891, 2868, 347, 824, 15180, 10454, 778, 6296, 320, 247, 4468, 275, 690, 15216, 323, 436, 3828, 50276, 250, 3065, 
337, 186, 68, 2040, 40590, 1554, 262, 1945, 4715, 342, 3676, 11454, 6928, 549, 32693, 1518, 28766, 35323, 374, 186, 17943, 522, 321, 312, 1162, 355, 3676, 1375, 2317, 3210, 323, 673, 2962, 16923, 272, 5723, 2824, 4765, 50276, 6050, 253, 9305, 2934, 556, 9023, 1142, 17660, 498, 274, 6787, 878, 281, 320, 1160, 1078, 352, 310, 4704, 323, 9311, 50276, 46458, 2905, 281, 253, 28939, 908, 1880, 14800, 875, 15302, 476, 9184, 285, 253, 10454, 3916, 273, 616, 4081, 3828, 275, 1635, 625, 7470, 49602, 403, 2424, 281, 4751, 7472, 3045, 3916, 5474, 33032, 2520, 789, 29328, 247, 747, 16923, 272, 1332, 323, 26277, 4715, 432, 247, 1781, 6363, 273, 2905, 2069, 12395, 253, 1332, 1925, 11419, 13371, 455, 3100, 47694, 7186, 1313, 356, 9388, 33772, 253, 10603, 432, 14237, 4197, 407, 271, 391, 9866, 281, 327, 383, 554, 6386, 43942, 835, 253, 3602, 403, 6311, 2439, 2709, 2069, 12395, 949, 896, 44263, 318, 436, 789, 2175, 253, 1182, 254, 6934, 302, 3700, 4715, 1895, 285, 29328, 247, 1332, 323, 352, 253, 1332, 310, 8489, 32809, 285, 7103, 556, 690, 3374, 923, 2708, 323, 4278, 50276, 2520, 789, 24357, 247, 747, 1332, 323, 562, 1171, 16848, 2069, 12395, 16923, 272, 326, 310, 3700, 4758, 3185, 273, 6789, 253, 1332, 247, 11419, 922, 29851, 1332, 534, 3133, 20276, 352, 651, 320, 1805, 281, 897, 253, 625, 2629, 285, 7208, 908, 1307, 273, 1182, 254, 6934, 302, 3700, 4715, 534, 13840, 1199, 1805, 342, 253, 1332, 4081, 50276, 3899, 356, 9388, 310, 2011, 281, 452, 2074, 20243, 285, 3541, 18332, 2429, 281, 247, 4156, 327, 383, 554, 6386, 391, 9866, 342, 2074, 2605, 2299, 352, 310, 12744, 849, 352, 26662, 342, 253, 643, 1666, 25379, 2011, 275, 2829, 337, 275, 958, 253, 760, 906, 4645, 3733, 673, 891, 8344, 310, 275, 4677, 374, 1669, 2299, 436, 760, 26662, 1313, 356, 9388, 281, 391, 9866, 752, 670, 253, 643, 3082, 3340, 295, 1257, 1832, 891, 369, 11525, 281, 923, 247, 2829, 2074, 281, 2829, 337, 533, 342, 3733, 673, 323, 512, 253, 8245, 3082, 2439, 512, 253, 15302, 326, 753, 891, 651, 5386, 253, 1390, 7680, 5393, 1580, 352, 310, 760, 275, 2426, 273, 391, 9866, 533, 417, 253, 4588, 1375, 23037, 14387, 3082, 50274, 22412, 2616, 285, 253, 625, 3332, 4216, 3676, 2616, 497, 1097, 2011, 281, 562, 32231, 3676, 274, 295, 1257, 1832, 285, 253, 643, 1666, 25379, 908, 849, 513, 841, 3082, 7277, 281, 1313, 356, 9388, 352, 3133, 597, 651, 320, 14916, 281, 897, 275, 436, 4758, 347, 973, 17837, 597, 943, 671, 320, 2908, 347, 1666, 25379, 285, 5469, 20420, 1580, 1097, 16584, 4156, 285, 1980, 4295, 285, 476, 320, 10166, 342, 247, 1781, 20689, 273, 2069, 12395, 50275, 783, 11743, 273, 2829, 495, 285, 2829, 577, 452, 247, 1745, 80, 2829, 495, 256, 785, 365, 327, 278, 20, 512, 512, 3210, 253, 38041, 4278, 5393, 275, 253, 30762, 403, 1175, 285, 3477, 281, 956, 50275, 783, 1332, 4081, 310, 4722, 533, 8489, 32809, 285, 253, 1895, 310, 1774, 2299, 627, 403, 2067, 3374, 326, 878, 281, 320, 9713, 253, 1332, 5431, 36908, 1347, 973, 2429, 281, 253, 1375, 23037, 14387, 347, 2011, 275, 2829, 337, 352, 310, 671, 8921, 2139, 1027, 17082, 403, 908, 323, 1027, 15302, 323, 4227, 273, 253, 1740, 15302, 908, 627, 403, 495, 1027, 17082, 908, 275, 253, 7103, 40515, 323, 1516, 1378, 376, 567, 256, 785, 365, 323, 278, 20, 285, 278, 2259, 323, 26742, 352, 651, 452, 644, 1805, 281, 1304, 512, 495, 17082, 323, 1016, 10895, 390, 816, 921, 581, 273, 253, 17082, 285, 253, 2571, 275, 253, 30762, 352, 4390, 3133, 281, 320, 33804, 5055, 247, 2372, 33810, 253, 3733, 673, 323, 1016, 8245, 285, 10895, 310, 5816, 4496, 2486, 247, 2829, 751, 2829, 337, 533, 342, 253, 20243, 323, 
1016, 8245, 285, 10895, 4496, 923, 1840, 323, 643, 2792, 4583, 253, 7680, 285, 38135, 273, 436, 789, 310, 3710, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 271, 47694, 11020, 7792, 326, 24772, 391, 9866, 285, 1980, 4872, 4445, 323, 253, 1895, 273, 11419, 922, 29851, 273, 673, 2962, 253, 4872, 1566, 476, 5028, 26672, 281, 1027, 673, 2962, 1223, 253, 391, 9866, 4445, 310, 6096, 2439, 2962, 30628, 1869, 253, 1895, 369, 1774, 253, 2929, 369, 3839, 2590, 285, 253, 4679, 9470, 2299, 597, 1119, 253, 8453, 281, 320, 3710, 285, 512, 2335, 2523, 342, 690, 273, 253, 4088, 326, 253, 14023, 497, 2218, 269, 91, 1288, 671, 5439, 253, 2523, 273, 10454, 273, 253, 4315, 27697, 4445, 273, 253, 1332, 50276, 74, 2868, 436, 2929, 1057, 2965, 327, 253, 18235, 1930, 273, 253, 19354, 1955, 281, 253, 3374, 273, 10454, 8453, 285, 27163, 342, 690, 2440, 253, 2929, 812, 5604, 320, 4704, 323, 14924 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: Presents mathematical analysis and computer simulation of a model that maximizes mutual information between photoreceptors and the RGC outputs of the retina. Shows that several different spatiotemporal filters are derived that mosaic the retina in different ways.

Strengths: A nice formulation and modeling approach to understanding retinal function; impressive technical work.

Weakness: Not sure what the main take-away is. The goal appears to be to understand the neural encoding in the retina, but after the analysis and results there is no attempt to tie these back to neurobiological mechanisms. It seems one could, but the paper just ends with the statement "our results are in strong agreement with observed retinal data", which leaves you hanging.

Specific issues: The difference-of-Gaussians model in Eq. 8: it mentions that the center position of each kernel is different for each neuron, but is this also learned? Not mentioned. Section 3, "linear model in the continuum limit": this is very unclear; what is being taken to the continuum, space? The integral is over frequency space; not following what's going on. Principal vectors a1, a2 and reciprocal vectors b1, b2: what are these? Section 4.1: "power spectral density can be well approximated by a product of spatial and temporal power-law densities"; Dong & Atick is cited, but curiously they claim the exact opposite: it is not separable. Figure 4, panel A shows striking clustering in temporal spectral centroids: they are all stacked neatly in tight columns, no scatter. Is this what emerges from the learned filters, or is the quantization somehow imposed? The mosaics are interesting to look at, but it is not clear what to take away from this. Overall this seems like a very promising direction; I want to like this paper, but I find it a bit confusing and lacking a clear message (see above). No societal impact issues.

This paper presents a model for retinal mosaic organization derived by application of the efficient-coding principle to natural movies. In particular, the model shows that the total number of retinal ganglion cells is a key parameter that controls the emergence of distinct cell types. The model is also used to extract predictions about the relative phase of distinct retinal mosaics as a function of input noise.

Strengths: The paper tackles a relevant problem, namely modeling the emergence of distinct retinal mosaics from an efficient-coding perspective, by building on a solid foundation (reference [10]).

Weaknesses: The paper is very hard to read, mostly because of (1) its tight coupling with reference [10] and (2) how much of it is relegated to the appendix. With respect to (1), I am not sure I understand fully what parts of the model and which results are fully novel here and which are minor refinements or variants of the results in [10]. Regarding (2), the appendix includes not only derivations for most of the equations shown in the main text and technical details for the realization of some of the plots, but also, to my understanding, the basic logic of some of the results (I'll give more details on this below under questions). Additionally, the paper claims strong agreement with observed data but contains no data and no quantitative assessment of such a claim. The limitations of the work are acknowledged very briefly in the discussion, despite the strong assumptions of the model (linear filtering, separable filters, firing rates instead of spikes), but not unsuitably so for a modelling work of this type.

The authors present a unified perspective on the appearance of different ganglion cell types, in terms of receptive-field size, temporal properties, and polarity, in the framework of efficient coding. They apply a previously developed model and efficient-coding framework to spatiotemporal movies. They show, first analytically in the case of a simplified linear model and later experimentally in the nonlinear case, how optimal spatial and temporal filters change as the number of neurons (channel capacity) increases. Additionally, they investigate the resulting mosaics of similar types for anti-alignment and show that input as well as output noise has an important but opposing influence on the alignment of ON/OFF receptive fields.

Strengths: The authors embed their work excellently into the previous literature and clearly work out the parallels and their novel contributions. The theoretical as well as the experimental approaches are well described and cover different interesting arrangements and detailed analyses. All experiments are carried out carefully and are of very high quality, as are the figures. The authors push the understanding of retinal layout further and try to unify previous approaches. However, there are some minor weaknesses.

Weaknesses: (1) There are some open questions (see below). (2) A simple schema for the model would help to understand the setup. (3) From Equation 2 it does not become clear what the differences to [7] are and how the formula is derived, as I could not find the exact formulation in [7]. (4) Some parts of Section 3, and especially Supplement A, are quite technical, and it is sometimes difficult to follow; the authors could try to strengthen a red thread and also give an intuition where possible. (5) Sections 3 and 4 seem to be rather detached from each other; while formulas are derived thoroughly, their interpretation and link to Section 4 could be improved. (6) Their final conclusion that previous cell types change down in temporal frequency (l. 274) is not well supported by the presented experiments; while it is more obvious for the spatial frequency, an additional analysis for the temporal domain could help to support this argument. Minor comments: l. 256, shouldn't it reference Eq. 8? The authors did not comment on limitations or negative societal impact; while the latter remains quite abstract, the limitations could indeed be discussed in more detail.

The authors study how retinal mosaics emerge from an efficient-coding framework. They extend the framework to spatiotemporal kernels and video inputs, which is a crucial extension to understand how natural environmental statistics influence coding in neural systems. Also, the approach is highly innovative, as it allows deriving multiple optimized mosaics.

Strengths: Extends previous models to spatiotemporal signals; studies the relationship between natural video statistics and optimal neural codes; interesting approach to derive multiple mosaics, with interesting results.

Weaknesses: Sometimes a bit jargony/dense; the authors should check how understandable each section is and make sure it can be followed by non-theoreticians. Limited by the focus on the efficient-coding objective; the authors could discuss whether reconstruction of the visual environment is really the ultimate goal of the visual system. The authors should discuss the limitation of the efficient-coding objective: is that really all that the organism cares about?
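Two of the modeling ingredients questioned above, the difference-of-Gaussians kernel of Eq. 8 and the separable power-law power spectral density of Section 4.1, have standard textbook forms, sketched below with illustrative parameter names (the paper's exact parameterization may differ):

\begin{align}
  K_i(\mathbf{x}) &= \frac{a_c}{2\pi\sigma_c^{2}}\,
    e^{-\lVert \mathbf{x}-\mathbf{x}_i\rVert^{2}/2\sigma_c^{2}}
    - \frac{a_s}{2\pi\sigma_s^{2}}\,
    e^{-\lVert \mathbf{x}-\mathbf{x}_i\rVert^{2}/2\sigma_s^{2}}, \\
  S(\mathbf{k},\omega) &\approx
    \frac{A}{\lVert\mathbf{k}\rVert^{\alpha}\,\lvert\omega\rvert^{\beta}}
\end{align}

Here \(\mathbf{x}_i\) is the (possibly learned) kernel center for neuron \(i\), \(\sigma_c < \sigma_s\) set the center and surround widths, and the second line is the separable spatiotemporal power-law approximation of the natural-movie power spectral density whose validity the first reviewer questions with the Dong & Atick reference.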
this paper received 1 accept 2 strong accepts and 1 reject all reviewers agree that the proposed model is elegant and that the technical work is impressive even the negative reviewer the main criticism of the negative reviewer is that the main take away is not clear the authors submitted a revised version of the manuscript sadly the reviewer did not read the rebuttal and/or engage in a discussion post-rebuttal the ac considers that the main criticism of this reviewer was addressed in light of this the ac recommends that the paper be accepted
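For readers unfamiliar with the difference-of-Gaussians receptive-field model that the first review asks about (eq 8 of the reviewed paper), a generic sketch is given below. It is an illustrative assumption of what such a kernel typically looks like, not the authors' actual parameterization or code, and the kernel size, widths, and surround weight are made up.

```python
import numpy as np

def dog_kernel(size, center, sigma_c, sigma_s, w_s=0.9):
    # difference-of-gaussians spatial kernel: a narrow excitatory center
    # gaussian minus a broader, weighted inhibitory surround gaussian,
    # placed at `center` (the per-neuron position the reviewer asks about)
    ys, xs = np.mgrid[0:size, 0:size]
    d2 = (xs - center[0]) ** 2 + (ys - center[1]) ** 2
    g_center = np.exp(-d2 / (2 * sigma_c**2)) / (2 * np.pi * sigma_c**2)
    g_surround = np.exp(-d2 / (2 * sigma_s**2)) / (2 * np.pi * sigma_s**2)
    return g_center - w_s * g_surround

# example: one 21x21 on-center kernel centered at pixel (10, 10)
kernel = dog_kernel(21, (10, 10), sigma_c=1.5, sigma_s=3.0)
```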
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper is generally easy to follow it addresses the problem of existing methods in evaluation of generalization of neural networks it proposes an interesting method to evaluate fcnns which includes computation cost and better exploits the structure of the network experiments are limited only one type of convolutional neural network unet the paper would be strong if its analysis extended to more fcnn architectures the experiments section is short and lacks sufficient details to fully understand the proposed analyses docsepthe paper seems to tackle an important question in deep learning that of providing visual quantitative notions of generalization beyond simply a test/validation performance that examines intermediate representations within deep networks although the models being examined in the paper may be limited to 3d u nets the principles being introduced seem to be general and may warrant broad interest 1 to me the paper was a bit confusing to follow with many details either lost within the explanation or the exact setup often lacking in explanation/justification and intuition 2 within the introduction of metrics section 22 the authors do not provide enough intuitions for the extensions they propose or theoretical justification for correctness or extensive evaluation under different test scenarios 3 the authors neither mention clearly nor justify why certain modeling choices were made during evaluation for example number of dimensions in pca number of clusters in kmeans number of mixtures in gmms 4 the experimental scope is also limited in terms of replicability of the results specifically only one dataset and one u net architecture is evaluated neither are performance trends evaluated for consistency within the dataset by subsampling/bootstrapping although different loss functions for training were used as a proxy for different models the results overall are not convincing enough in terms of the main thesis of the paper docsepthe problem the authors tackle is extremely relevant and the paper is well written and embedded in the literature the paper is quite varied and gives many interesting pointers to find different angles to evaluate and develop new models past simply showing performance on a specific validation dataset these methods are very relevant to real applications of convolutional networks where labeled data is sparse and the variation in data is large thus estimating generalisation capabilities is very valuable the final metric does not have a very strong correlation as acknowledged by the authors although the correlation metric itself is already interesting and a possible metric for future unsupervised generalisation metrics to optimise the discussion of existing metrics by previous work and what is newly introduced by the authors can be a bit hard to separate part is discussed under 11 prior work but it continues in all further subsections although contributions are more clearly spelled out under 12 contributions docsep1 visualizing the generalization of unets is an important area of research and the paper highlighted the drawbacks of existing approaches 2 the efforts towards labelfree generalization metrics and visualizations are challenging and very much needed in general i think this paper is hard to follow and lacks sufficient empirical results to support the main claims 1 i find it difficult to understand how instrumentation and local receptive field analysis
is designed exactly the paper attempts to formalize a layer into a mapping function from r^(3x3x3x1) to r^k where k represents the number of feature channels and k is further reduced to d through pca i dont understand why and how the receptive field can be reduced to k in the first place and how it differs from directly applying pca to the receptive field to d dimensions without the step in between 2 the paper lacks empirical evidence to support the proposed approaches the first proposal in the paper instrumentation and local receptive field should be evaluated against methods without local pca the same with the full model in section 32 the lack of comparison makes it very difficult to know how effective the proposal is furthermore the correlations in fig 5 are too weak to draw meaningful conclusions
the authors address a hot topic visualizing features and representations inside fcns to improve interpretability with an interesting methodology even though the paper initially had some limitations such as i the experimental scope is limited only one type of architecture tested ii some technical aspects need to be discussed iii there are some clarity issues the authors took the reviewers comments into account in their new paper version in addition the code is released to the public thus i will follow the reviewers suggestion to recommend acceptance of this paper
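The reviews above question how a layer's per-voxel responses are reduced with PCA and then clustered. The snippet below is one plausible reading of that pipeline, written purely as an illustration; it is not the authors' implementation, and the reduced dimensionality d and the cluster count are arbitrary assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def summarize_layer(features, d=8, n_clusters=5):
    # features: (x, y, z, k) activations of one 3d u-net layer, i.e. a
    # k-channel response vector for each voxel / local receptive field
    k = features.shape[-1]
    samples = features.reshape(-1, k)                      # (num_voxels, k)
    reduced = PCA(n_components=d).fit_transform(samples)   # k -> d dims
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(reduced)
    return reduced, labels.reshape(features.shape[:-1])
```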
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this work revisits a famous counterexample on the convergence of adam originally presented in reddi 2018 the authors show that if the ema parameter beta2 in rmsprop and adam is chosen high enough then both methods converge to a bounded region in the stochastic setting in addition the authors provide some results for the fullbatch case crucially and differently from many other papers on the topic the gradients are not assumed to be bounded and the beta2 hyperparameter is not chosen to increase to 1 the paper is well written and the logic of it is convincing i like the introduction and figure 1 this nicely illustrates the relevance of this paper it is also very well organized unfortunately i did not have the time i wish i had to dig into the proofs just had a quick check but the methodology of the authors and the results are convincing this is overall a very nice paper with clean and easy to read results that clarifies an important point it is misleading to claim that adam does not converge which was pointed out in reddi 2018 to introduce amsgrad i have heard this wrong claim many times in the optimization community hence i think this paper deserves attention therefore my clear accept this work truly does bridge the gap between theory and practice in nonconvex stochastic optimization just a few suggestions i think the authors should cite and discuss the results in defossez et al 2019 on the convergence of adam and adagrad also i think figure 1 deserves better quality its done in matlab so in the xlabel command you can put interpreter latex and fontsize 20 finally i spotted 1 typo in remark 2 cases of nondivergence cases docsepthe paper starts off from the recent realization that there exist divergent examples for any set of hyperparameters for algorithms in the adam family such as rmsprop it sets out to study the effect of the beta2 parameter on convergence for a fixed specific problem the analysis shows that there exists a beta2 < 1 that leads to convergence for realizable problems and to convergence to a bounded region of interest for nonrealizable problems without requiring a bounded gradient assumption experiments confirm this new theory overall the paper is well written clear and easy to read one of its strongest points is how well the analysis and the relevance of the results is motivated for instance the importance of removing the assumption on the bounds on the gradient because it effectively removes one of the convergence/divergence regimes is well executed there are also significant efforts on providing clear simplified examples from rather complex theorems which is very appreciated eg corollary 41 further there is a real effort to contrast the results with the previous work and to explain how it complements them resolving clearly what initially appears as direct contradictions the results are relevant both from the point of view of the theory where it adds to a body of work explaining how and why the adam family of algorithms performs well on modern machine learning taskloads and from the point of view of the practitioner outlining what hyperparameter tuning is necessary to achieve convergence they are also original in the sense that they provide novel insights while removing problematic assumptions that permeate most of the related work a couple of things could be improved as pointed out in the paper if beta2 = 1 the algorithm degenerates to sgd while there is a remark explaining
why as long as beta2 < 1 the two algorithms differ it would be informative to compare the convergence regimes with high beta2 to sgd directly to validate that there exists a set of hyperparameters that not only provide convergence but improved convergence properties compared to sgd otherwise the results are a lot less relevant as well as give an order of magnitude of what value is typically necessary for beta2 condition 4 in theorem 43 is quite difficult to apprehend with a slightly worrying beta2n term more exegesis would be beneficial for reader comprehension overall this is a nice well written and relevant paper that clears the bar for publication in its current versiondocsepsummary the paper studies one of the most popular algorithms in machine learning rmsprop more specifically it investigates the relation between the hyperparameters and the convergence of the algorithm by proving the convergence without using the bounded gradient assumption the authors establish a phase transition from divergence to nondivergence for rmsprop pros 1 the paper concerns one of the most important algorithms in machine learning in my opinion the problem is practical and of interest in the machine learning community 2 the results of the paper provide explicit conditions for the hyperparameters of rmsprop/adam that ensure the convergence of the algorithms these results provide basic guidelines for tuning hyperparameters of the algorithms in practice cons apart from the strong points i still have some concerns about the clarity of the paper i hope the authors can address my concerns to improve the quality of the paper 1 the parameter beta2 is the most important subject of the paper until algorithm 1 the paper discusses beta2 without defining it clearly it would be more clear if beta2 is mentioned from the beginning of the paper that it comes from algorithm 1 2 the authors divide the problems into 2 subclasses to investigate realizable and nonrealizable which are not clearly defined it would be better if the authors can define these 2 subclasses more formally 3 the experiments supporting the theoretical results are comprehensible however i would suggest the authors provide a figure with xaxis to be epochs and yaxis to be accuracy so that the readers can have a better idea of how sgd and rmsprop behave during training ### Summary:
the paper shows convergence results for rmsprop in certain regimes the reviews are uniformly positive about this paper and i recommend acceptance
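As background for the beta2 discussion in the reviews above, a minimal sketch of the RMSprop update is shown below. This is the standard textbook form, not code from the reviewed paper; it only makes concrete why beta2 = 1 freezes the second-moment estimate so the method collapses to a rescaled SGD step, and the learning rate and epsilon values are illustrative assumptions.

```python
import numpy as np

def rmsprop_step(w, grad, v, lr=1e-3, beta2=0.99, eps=1e-8):
    # v is the exponential moving average of squared gradients; beta2
    # controls how slowly it moves. with beta2 = 1, v never changes,
    # so the update becomes plain sgd with a fixed per-coordinate rescaling.
    v = beta2 * v + (1.0 - beta2) * grad**2
    w = w - lr * grad / (np.sqrt(v) + eps)
    return w, v
```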
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper poses an approach to retrosynthesis that addresses the challenges of i availability of reactants and ii generalization to unseen templates to achieve this the authors reformulate retrosynthesis as the selection of reactants from a fixed set in the case of the uspto database this is the set of 671578 commercially available reactants used in the database their reactant selection framework retcl uses gnns to calculate selection scores for all candidate molecules a novel contrastive training scheme is used to learn the selection score which is computed as the cosine similarity between embeddings of the product and the reactants computed by the graph neural networks the authors provide a good summary of related work in this area my major questions for the authors of this paper concern the limitation imposed by restricting to a specific candidate set for example the results reported on uspto50k are very impressive particularly the ability to generalize to heldout reaction types however the model is never challenged by reactions for which the reactants are not present in the candidate set and it is completely unclear how the model would perform in this scenario this is important because to tackle the chemically relevant retrosynthesis problem it is exactly necessary to solve reactions where the reactants may not be present in the 671518 reactants present in uspto50k i find it a bit surprising that the authors do not address this constraint in the main text and i would ask that they carry out experiments where they restrict the candidate set to a subset of the 671518 reactants yet consider reactions from the whole database and ask how severe this restriction is in terms of the solutions obtained by the model it would be necessary to build these splits carefully and provide results for multiple splits there are other arguments that the authors could also make to defend this point in addition to these experiments but the current state in which the issue is ignored is not satisfactory the description of the contrastive training approach is clear and coherent the addition of hard negatives to the batches to improve learning is interesting and the results of the ablation study speak to the important role that it plays docsepretcl enumerates all of the candidate molecules all us patent dataset all 671k based on selection scores computed by graph neural networks the cosine similarity between products and reactants is used to design scores which are later used for training the way of using cosine similarity to bridge reactants to products is interesting q1 used test data as selection set during training training using all us patent dataset 671k as candidates for retcl to select from leaked test data uspto50k test data during training all us patent dataset is a superset of the uspto50k data selection based algorithm tends to achieve overly optimistic results due to this reason q2 how to generalize it is great that the paper shows that retcl generalizes well to unseen templates however if we select upon an existing dataset though very large how does retcl possibly yield totally unseen reactants ie not in any existing dataset q3 approximation of c sec 23 computing prp rgiven c and qpr c requires summing over all candidates in c which is necessary for proper probability but computationally expensive how can these probabilities be approximated by a minibatch of reactions the batch statistics are
very different from the c all us patent dataset all 671k docsep summary of the paper this paper proposes a sequential reactant selection scheme for retrosynthesis in each step the model gives a ranking of reactants based on previously chosen reactants rgiven after all the reactants are selected the model checks whether the chosen reactants result in the desired product the ranking module psi is trained via contrastive learning the negative reactant candidates are constrained to be similar to the positive reactant strengths and weaknesses 1 the method proposes a selection based approach to address one of the weaknesses of templatefree approaches the predicted reactants may be commercially unavailable this is indeed an important issue that needs to be addressed however i am afraid the proposed approach is overkill if we only select reactants that appear in the uspto database the model cannot generalize to new reactions which involve new reactants not in the uspto database this is problematic for two reasons 1 for instance if the model is evaluated on a harder test set where ground truth reactants are not in the uspto database i think the model will fail with 0 top1 accuracy 2 moreover in multistep retrosynthesis you are allowed to make new intermediate compounds from commercially available compounds in order to make your final product i dont see why we have to choose reactants only from commercially available compounds 2 scalability the total number of commercially available compounds is up to 10^9 eg the enamine real database i am concerned that the neural network based ranking will run very slowly and cannot scale to larger sets of compounds 3 in section 34 the authors evaluate their approach on a harder test set with novel reaction templates in my opinion templatefree approaches are also capable of generalizing to novel reaction templates why is there no comparison to g2g or transformer in table 5 4 the proposed method is quite straightforward with limited technical novelty in my opinion i am afraid the contrastive learning part is just a straightforward application of negative sampling 5 the result looks very strong on the uspto50k test set overall evaluation i vote for weak reject mainly for two reasons 1 the method seems incapable of generalizing to new reactant compounds outside of the commercial library 2 the approach has limited novelty for the iclr audience post rebuttal i would like to thank the authors for their valuable response the experimental results seem strong but technical novelty is still limited my review score remains the same i believe this paper can make great impact if submitted to a chemistry journal suggestions despite my negative evaluation i do appreciate the point the authors are trying to address i think the paper can be significantly strengthened if you can loosen the commercially available constraint for instance you can loosen the constraint to synthesizable choosing the reactants that can be synthesized from a given set of building blocks via reactions in multistep retrosynthesis you are allowed to make new compounds from your building blocks for the sake of making your final productsdocsepthis submission describes an approach to singlestep retrosynthesis based on contrastive learning that selects reactants that can be used to synthesize a target product in a single step the stated contributions are 1 an approach to retrosynthesis that is constrained to only select available starting materials and 2 a novel contrastive learning scheme with hard negative mining the use of a
contrastive loss to learn an embedding of reactants and their products that exhibit the property that the sum of reactant vectors has a high cosine similarity to the product vector is clever this is a nice way to give structure to a continuous vector space the strategy of hard negative mining is also clever and identifies the examples that would intuitively be the most informative i take some issue with the premise of constraining retrosynthetic recommendations to an enumerated list this works for small corpora like the ones used here but in reality retrosynthesis is a multistep process where most reactants are not commercially available and the onestep retrosynthetic expansion must be repeated recursively this approach is fundamentally unable to operate on reaction products where multiple synthetic steps are required which represent the challenging cases that is why in the multistep evaluations the authors rely on the transformer model to propose intermediate structures appendix d also suggests that in the pathway search experiments knowledge of the routes in advance was required to construct the set of all starting materials to select from the empirical evaluation as a result of the premise is somewhat flawed by constraining reactant proposals to an enumerated list of reactants extracted from a parent database the authors have simplified the task in comparison to previous approaches making a headtohead comparison of accuracy less informative the evaluation in 34 generalizing to unseen templates could simply be an indication that the model is learning atom conservation excepting leaving groups and to maximize substructure overlap with the products the model has access to a restricted list of possible starting materials of which very few are likely to be plausible precursors for a given product this advantage invalidates the comparison in table 5 this is not evidence of generalization since the test set answers were included in the set of candidates while the contrastive learning approach is clever this work uses a contrived formulation for retrosynthesis that is not applicable to multistep planning and the experiments do not support the conclusions drawn ### Summary:
while the reviewers appreciated the proposed contrastive training scheme and the strong related work summary all reviewers agreed that the approach was severely limited by being a pure selectionbased method without the help of another model that proposes molecules the approach can only select reactants from an existing set as target molecules become more complicated the modeller must make a choice a use a much larger initial candidate set which hopefully encompasses all molecules necessary to make the target molecule or b use another model to propose new intermediate molecules the authors went with b which harmed their novelty claim a big reason why retrosynthesis is hard is the need to generate unseen molecules and if this is left to an already proposed model the current approach is not adding much methodological novelty while their approach does improve upon existing work in the multistep setting theres even more recent work that has not been compared against eg https://arxiv.org/pdf/2006.07038.pdf so the improved performance may already be outperformed the fix is straightforward modify the methodology to also propose intermediate molecules this will fix the novelty complaint and strengthen the practicality argument practitioners could directly use this approach to discover synthesis routes the authors could slightly update the related work add comparisons against recent methods and take into account the other feedback given by the reviewers the paper is very nicely written the proposed changes are purely methodological and not insurmountable in my opinion i would urge the authors to make these changes which i believe will result in a very nice paper
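the reviews above describe retcl's selection score as a cosine similarity between gnn embeddings of a product and of candidate reactants, with the sum of the chosen reactant vectors expected to align with the product vector. the sketch below only illustrates that scoring idea under those assumptions; it is not the retcl implementation, and the embed / selection_scores names, the random stand-in encoder, and the subtraction-based conditioning on already-chosen reactants are all hypothetical.

```python
# Illustrative sketch, not the RetCL implementation: rank candidate reactants
# by cosine similarity against a product embedding. embed() is a stand-in for
# the paper's graph neural network encoder.
import numpy as np

def embed(molecule_id, dim=64):
    # Stand-in encoder: a fixed pseudo-random unit vector keyed by the id.
    local_rng = np.random.default_rng(abs(hash(molecule_id)) % (2**32))
    v = local_rng.normal(size=dim)
    return v / np.linalg.norm(v)

def selection_scores(product_id, candidate_ids, chosen_ids=()):
    # Illustration of conditioning on already-chosen reactants: subtract their
    # embedding sum from the product embedding, so the remaining target keeps
    # the "sum of reactant vectors aligns with the product" property that the
    # reviews mention.
    target = embed(product_id) - sum((embed(c) for c in chosen_ids), np.zeros(64))
    target = target / (np.linalg.norm(target) + 1e-8)
    scores = {}
    for cand in candidate_ids:
        scores[cand] = float(np.dot(target, embed(cand)))  # cosine similarity
    return scores

# Toy usage: rank a small hypothetical candidate set for one product.
candidates = [f"reactant_{i}" for i in range(5)]
ranked = sorted(selection_scores("product_0", candidates).items(),
                key=lambda kv: -kv[1])
print(ranked[:3])
```

in a real selection-based system the candidate set would be the full commercial library, so the dot products would be computed in bulk (e.g. one matrix multiply) rather than in a python loop; the loop here is only for readability.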
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the authors propose to combine the badgan framework and vat to accelerate learning in the semisupervised setting the paper shows that the vat approach is actively pushing the decision boundary away from the highdensity regions while the badgan approach pulls the decision boundary to lowdensity regions this simultaneous push and pull leads to faster convergence in testing accuracy the authors also report competitive results on standard datasets used for ssl such as svhn and cifar10 positives the approach overcomes some of the difficulties with badgan which arise from training a gan and density estimation network for generating bad samples useful for ssl instead of using a gan the proposed approach uses adversarial samples using vat that are sufficiently confusing to the current estimate of the classifier the theoretical justifications for the vat interpretation are interesting and convincing the visualizations of the bad samples show qualitatively that the bad samples from the badgan and proposed approach differ several other visualizations aid in understanding the behavior of the algorithm negatives requires additional hyperparameter tuning in tau and rho tuning these with large validation sets can lead to an overoptimistic estimate of the generalization how sensitive is the performance to these parameters as the authors point out the method has limitations when the number of labeled samples is much smaller it will be nice to see some results in this aspect please include more details to clarify what is meant by the claim that the roles of the second terms of 1 and 2 are overlappeddocsepthis paper makes the interesting observation that the generative procedure proposed by the bad gan paper can be replaced by a slightly modified vat procedure the reasoning is sound and leverages the intuition that adversarial examples subject to a sufficiently small perturbation radius are likely to be closer to a decision boundary than the original sample the paper is generally easy to follow but the presentation could be improved in particular more could be done to describe the terms in equation 5 im also curious about the behavior of ltrue which is equivalently the fourth term in eq 1 even when reading the bad gan paper i did not quite understand their claim that this can be correctly interpreted as a conditional entropy term if they really wanted conditional entropy they should probably have either done hpkx or hpkx k k i agree with the authors that the roles of the second and fourth terms overlapped and i think this is sufficiently interesting to warrant some further elaboration in the paper i also liked the reminder that power iteration selects a nonunique sign for the first eigenvector subject to the random vector initialization i encourage the authors to do an ablation test to convince the reader that this modification helps to improve convergence speed of the test accuracy
could extend this to more general nonlinear classifiers perhaps subject to some assumptions that would be more interesting i dont think proposition 2 has any real value and recommend its relegation to the appendix i think the biggest weakness of this paper is the experiments taking table 1 at face value the conclusion that fat is simply competitive with existing approaches suggests that the additional machinery isnt particularly useful providing little more than a vanilla vat i also think mnistsvhn have run their course as good semisupervised learning benchmarks and would prefer to see such algorithms being scaled to more complex data the main argument for why fat should be preferred over vat comes from section 62 figure 4 is more interesting but is complicated by the fact that fat checks both possible eigenvectors u during training which requires two forward passes in the classifier did the authors give a similar treatment to vat please show wallclock time too unfortunately the computational efficiency gain seems to only hold true for mnistsvhn but not for cifar i worry that the observed gains will not sustain once we move to more complicated datasets pros simple and clean proposal easy to read cons limited insight weak experimentsdocsepthe paper proposes to use the technique in vat to generate adversarial complementary examples in the k+1class semisupervised learning framework described by the bad gan paper this leads to a formulation that combines the vat loss and the k+1class classification loss the paper also provides analysis regarding why vat is useful for semisupervised learning pros 1 it is interesting to bridge two stateoftheart semisupervised learning methods in a meaningful way 2 some positive results have been presented in table 1 and figure 4 cons and questions 1 i dont understand the authors claim that fat uses both pushing and pulling operations it might be true that both bad gan and vat encourage a decision boundary in the lowdensity region but how are they different are pushing and pulling really different things here 2 unfortunately the proposed method does not give substantial improvement over bad gan or vat in terms of accuracy 3 if using vat to generate bad samples is a reasonable approach then based on the theory in dai et al the bad gan formulation would not need the additional vat regularization term to guarantee generalization on the other hand based on the theory of proposition 2 vat itself should be sufficient why do we still need the k+1class formulation it seems that the combination of bad gan and vat objectives has not been well motivated or fully justified does this explain the fact that not much empirical gain was obtained by this method 4 the authors try to use proposition 1 to motivate the use of vat for generating complementary examples however it seems that the authors misinterpret the concept of bad examples proposed in dai et al the original definition which led to the theoretical guarantees in dai et al of bad examples is lowdensity data samples in the current paper the authors assume that data samples close to decision boundaries are bad examples this is not sound because lowdensity samples are not equivalent to samples close to decision boundaries especially when the classifier is less perfect as a result the theoretical justification of using vat to sample complementary examples is a bit weak 5 there is no ablation study of different terms in the objective function 6 in figure 4 you can compare your method with bad gan without a pixelcnn bad gan does not need a
pixelcnn to achieve the reported results in their paper and their results are reproducible by running the commands given in the github repo it would be good to add this comparison ### Summary:
the paper combines the ideas of vat and bad gan replacing the fake samples in the bad gan objective with vat generated samples the motivation behind using the k+1 ssl framework with vat examples remains unclear particularly in the light of prop 2 which shows that smoothness of the classifier around the unlabeled examples is enough which vat already encourages r2 and r3 have raised the point of limited insight and lack of motivation behind combining vat and bad gan objectives in this way r2 and r3 are also concerned about the empirical results which show only marginal improvements over vat and bad gan in most settings the ac feels that the idea of the paper is interesting but agrees with r2 and r3 that the proposed objective is not motivated well enough what is the precise advantage of using the k+1 ssl formulation with vat examples the paper really falls on the borderline and could be improved if this point is addressed convincingly
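the reviews and summary above lean on vat's adversarial perturbation, which is found by an approximate power iteration and whose sign depends on the random initialization. the toy sketch below illustrates that single step for a hand-made two-class logistic classifier only; it is an assumption-laden illustration rather than the authors' fat or vat code, and the vat_direction name, the finite-difference gradient, and the toy weights are all hypothetical.

```python
# Illustrative sketch, not the authors' FAT/VAT code: one power-iteration step
# of a VAT-style adversarial direction for a toy logistic classifier.
import numpy as np

rng = np.random.default_rng(1)
w, b = np.array([1.5, -2.0]), 0.3          # toy two-class logistic classifier

def predict(x):
    p1 = 1.0 / (1.0 + np.exp(-(x @ w + b)))
    return np.array([1.0 - p1, p1])

def kl(p, q, eps=1e-8):
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def vat_direction(x, xi=1e-4, n_power_iter=1):
    # Approximate the direction that most increases KL(p(x) || p(x + r)).
    # Each loop is effectively d <- H d / ||H d|| (H the KL Hessian), computed
    # here with finite differences; the resulting sign depends on the random
    # initialization of d, as the reviews point out.
    p = predict(x)
    d = rng.normal(size=x.shape)
    d /= np.linalg.norm(d)
    for _ in range(n_power_iter):
        grad = np.zeros_like(x)
        for i in range(x.size):            # numerical gradient of KL w.r.t. d
            e = np.zeros_like(x)
            e[i] = 1e-5
            grad[i] = (kl(p, predict(x + xi * (d + e))) -
                       kl(p, predict(x + xi * (d - e)))) / 2e-5
        d = grad / (np.linalg.norm(grad) + 1e-12)
    return d

x = np.array([0.5, 0.5])
print("adversarial direction:", vat_direction(x))
```

for this toy linear classifier the returned direction is (up to sign) along the weight vector, i.e. toward the decision hyperplane, which is the behavior proposition 1 of the reviewed paper formalizes for the two-class logistic case.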
input_ids: [numeric token-ID array for the preceding review example; values omitted]
attention_mask: [all-ones array of the same length; values omitted]
labels: [identical to the input_ids array above; values omitted]
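Each example in this dump carries, besides the review text and its summary, the tokenized fields input_ids, attention_mask, and labels. The sketch below shows one common way such fields are produced for a summarization example of this kind; the tokenizer checkpoint, the maximum length, and the convention of copying input_ids into labels are assumptions made for illustration, since the dump itself does not state how its arrays were generated.

```python
from transformers import AutoTokenizer

# Assumed checkpoint; the dump does not say which tokenizer produced its IDs.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def build_example(review_text, summary_text, max_length=2048):
    """Tokenize prompt + review + summary into the three fields seen in this dump."""
    prompt = (
        "Below is given a review of a research paper from a conference journal. "
        "Please write a summary of the review. ### Review: "
    )
    full_text = prompt + review_text + " ### Summary: " + summary_text
    enc = tokenizer(full_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],            # numeric token IDs for the full text
        "attention_mask": enc["attention_mask"],  # 1 for every real (non-padding) token
        "labels": list(enc["input_ids"]),         # causal-LM style: labels mirror inputs
    }
```

In this dump the attention_mask entries are all ones and the labels repeat the input_ids, which is consistent with unpadded, causal-LM-style examples of this form.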
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

Summary: The paper focuses on sample importance in adversarial training. The authors first reveal that overparameterized deep models on natural data may have insufficient model capacity for adversarial data, because the training loss is hard to drive to zero in adversarial training. They then argue that the limited capacity should be spent on the important samples; that is, we should not treat all samples as equally important. They use the distance to the decision boundary to distinguish important samples and propose geometry-aware instance-reweighted adversarial training (GAIRAT). Experiments show its superiority over baselines.

Pros:
- The finding on insufficient model capacity is very interesting, and the resulting motivation for GAIRAT is intuitive and well explained.
- The authors propose a realizable measurement to compute the distance to the decision boundary; this is inspiring for a series of decision-based works.
- The experiments demonstrate the effectiveness of the proposed method.

Cons:
- Treating data points differently has been investigated in related work such as MART and MMA; the authors should discuss the differences from these methods.
- The capacity analysis provides a very good perspective for analyzing adversarial training, but the explanations in Figure 2 are a little weak.
- The weight function of Eq. 6 lacks an intuitive explanation: why such a formula, and why these constants?
- PGD steps are also investigated in the CAT and DAT papers; the authors should discuss the differences to them as well.
- The experiments should compare with baselines that also consider per-example differences, such as MART and MMA.
- The evaluation should test some modern white-box attacks such as AutoAttack; PGD alone is not convincing. Besides, black-box attacks should be tested for a complete evaluation and for checking obfuscated gradients.

---

This paper focuses on adversarial learning: it improves robustness while keeping accuracy. To achieve this, the authors find that adversarial data should have unequal importance, which naturally leads to geometry-aware instance-reweighted adversarial training (GAIRAT).

Pros:
1. The paper has strong novelty at the philosophical level. The common belief is that robustness and accuracy hurt each other; this paper shows that robustness can be improved while keeping accuracy. As far as I know, this point has never been explored before.
2. The paper is well motivated and easy to follow. First, the authors use Figure 1 to illustrate GAIRAT, which explicitly gives larger weights to the losses of adversarial data, and two toy examples in Figure 3 explain GAIRAT further. Second, the overall logic of the paper is easy to follow; for example, after the motivation of GAIRAT is explained, the objective function of GAIRAT and its realization are presented clearly.
3. The paper is sufficiently justified in experiments. For example, PGD-200 has been used to verify the robustness of GAIRAT, which in my personal opinion is quite a strong result. Moreover, the authors upgrade their method by incorporating FAT and verify the robustness of GAIR-FAT.

Cons:
1. In the top-right panel of Figure 10, the SVHN experiments have a period of increasing robust training error for GAIRAT; could you explain this?
2. Although the authors show that model capacity is not enough in adversarial training, how large should the DNN be for the capacity to be enough? What do you think?

---

This paper challenges the common belief of an inherent tradeoff between robustness and accuracy. Instead of recent methods that improve accuracy while maintaining robustness, this paper proposes a geometry-aware instance-reweighted adversarial training (GAIRAT) method to improve robustness while maintaining accuracy.

Pros:
1. The direction, improving robustness while maintaining accuracy, is novel and interesting. Specifically, several papers challenge the inherent tradeoff, e.g., by using more data [1], utilizing early-stopped PGD [2], and incorporating dropout [3]. This paper also challenges the inherent tradeoff; however, different from [2, 3], which improve accuracy while maintaining robustness, this paper goes in the other direction. To my knowledge, this is the first paper to explore this direction.
   [1] Understanding and mitigating the tradeoff between robustness and accuracy, ICML 2020. [2] Attacks which do not kill training make adversarial learning stronger, ICML 2020. [3] A closer look at accuracy vs. robustness, NeurIPS 2020.
2. The paper makes two conceptual improvements. (a) It explicitly argues that overparameterized networks that have enough model capacity in standard training suffer from insufficient capacity in adversarial training, though many studies have already shown that adversarial training needs large models. (b) It argues that, under limited model capacity, adversarial data should have unequal importance. Unequal treatment of data was explored in traditional ML methods several years ago, but it is rare in deep learning at this moment.
3. The proposed GAIRAT method is effective, indeed increasing robustness while retaining accuracy. The experiments are comprehensive over different network structures, datasets, and attack methods, and the experiments in the appendix provide much useful information.

Cons:
1. The design of the weight assignment function in Section 3.3 seems heuristic. Could you explain the principles behind assigning instance-dependent weights?
2. In Figure 4, the GAIRAT method can relieve undesirable robust overfitting. Could you explain more about this, for example why robust overfitting exists in standard adversarial training and how/why GAIRAT relieves it?

### Summary:
The paper proposes an insightful study on the robustness and accuracy of the model. It is hard to simultaneously keep robustness and accuracy; a few works have tried to improve accuracy while maintaining robustness by investigating more data, early stopping, or dropout. From a different perspective, this paper aims to improve robustness while maintaining accuracy.

There are some interesting findings in this paper which could deepen our understanding of adversarial training. For example, the authors conducted experiments with different sizes of the network in standard training and adversarial training: the capacity of an overparameterized network can be sufficient for standard training, but it may be far from enough to fit adversarial data because of the smoothing effect. Hence, given the limited model capacity, adversarial data all have unequal importance. Though this technique is simple and widely studied in traditional ML, it is an interesting attempt in adversarial ML, and the authors provide extensive experimental results to justify its effectiveness.

In the authors' responses, the concerns raised by the reviewers have been well addressed. The new version becomes more complete by including more results on different PGD steps and insights on designing the weight assignment function. The authors also gave an interesting discussion on how large a model is enough for adversarial training, though it is still kind of an open question. I would thus like to recommend the acceptance of this paper.
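The reviews and the meta-review above describe the mechanism only in prose: each training example gets a weight based on its geometric distance to the decision boundary, approximated by the least number of PGD steps needed to flip the prediction, and the adversarial losses are combined with those weights. The sketch below is a minimal, hypothetical illustration of that idea; the function names, the particular weight mapping, and all hyperparameters are assumptions for illustration and do not reproduce the paper's exact Eq. 6.

```python
import torch
import torch.nn.functional as F

def pgd_with_kappa(model, x, y, eps, alpha, num_steps):
    """Run PGD and record, per example, the first step index (kappa) at which
    the prediction flips. Smaller kappa roughly means closer to the boundary."""
    x_adv = x.clone().detach()
    kappa = torch.full((x.size(0),), float(num_steps), device=x.device)
    flipped = torch.zeros(x.size(0), dtype=torch.bool, device=x.device)
    for step in range(num_steps):
        x_adv.requires_grad_(True)
        logits = model(x_adv)
        newly_flipped = (~flipped) & (logits.argmax(dim=1) != y)
        kappa[newly_flipped] = float(step)
        flipped |= newly_flipped
        loss = F.cross_entropy(logits, y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        with torch.no_grad():
            x_adv = x_adv + alpha * grad.sign()
            x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv.detach(), kappa

def geometry_aware_weights(kappa, num_steps):
    """Map kappa to normalized instance weights: examples that flip earlier
    (near the boundary) get larger weight. This linear mapping is only a
    stand-in for the paper's actual weight assignment function."""
    w = (num_steps - kappa) / num_steps + 1e-2   # keep every weight positive
    return w / w.sum()

def reweighted_adversarial_step(model, optimizer, x, y,
                                eps=8 / 255, alpha=2 / 255, num_steps=10):
    """One geometry-aware, instance-reweighted adversarial training step."""
    x_adv, kappa = pgd_with_kappa(model, x, y, eps, alpha, num_steps)
    weights = geometry_aware_weights(kappa, num_steps)
    per_example_loss = F.cross_entropy(model(x_adv), y, reduction="none")
    loss = (weights * per_example_loss).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The design choice the reviewers question, how exactly kappa is mapped to a weight, is precisely the part replaced here by a simple linear stand-in.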
input_ids: [numeric token-ID array for the review example above; values omitted]
attention_mask: [all-ones array of the same length; values omitted]
labels: [identical to the input_ids array above; values omitted]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

Training deep neural networks in the presence of adversarial perturbations of the input data is a very active research topic: there are lots of works defining notions of robustness, proposing solution algorithms, and introducing algorithmic improvements. Several of the recent techniques involve or extend the interval bound propagation (IBP) technique; however, there is no work analysing the convergence of IBP, even in its simplest setting. In this paper the authors analyse the convergence of IBP in a simplified setting for the first time.

Although I am aware of the line of research on certifiably robust training of neural networks, I have not seen any work on the analysis of convergence of the gradient descent algorithm, especially in the IBP setting. Hence, assuming there is really no work doing this, I believe this paper studies a very essential and relevant problem. The mathematical steps look correct to me.

My major concerns:

1. Presentation. Overall, I believe the paper does not introduce the literature thoroughly, includes multiple typos, and has mistakes in the terminology. I will list some of those at the end of my review, but I think in general the paper should undergo a major revision: language, terminology, editorial positioning, presentation of the existing literature, presentation of the main theorem and the proofs.

2. Generality and assumptions. Sections 1-3 are on known work and Section 5 is not very critical, since this paper proves a convergence result theoretically; hence this paper's contribution can be summarised by Theorem 1 and the lemmas leading to it. This result is for (i) overparametrised networks, (ii) box perturbations with small enough radius, (iii) binary (plus/minus one) classification problems, (iv) ReLU networks, and (v) a single, wide hidden layer. Even in the presence of these assumptions, the results only hold with high probability. Although having some assumptions is essential (e.g., bounding the perturbation radius), most results follow from the specific set of assumptions that work for this case but cannot be easily extended (e.g., the classes being plus/minus one helps largely because classification reduces to looking at the sign of the linear expressions, etc.). Moreover, the upper bound on the error radius decays if we require a stronger result, not the other way around; currently, if we fix an architecture, then we should decrease the error radius until the result works.

Minor points:
- Typos: "overparameteried" (abstract) vs. "overparameterized" (page 2); "maxmization" (page 3); "prwr0t xi" (page 7).
- "Training can converge to a point" is used frequently, but it is not clear what it is.
- The perturbations are additive, but this is not mentioned; there are other well-known perturbations, including multiplicative, affine, or functional transformations.
- End of Section 1: "under certain conditions" is not clear unless someone reads the whole paper; could the authors please mention the conditions briefly?
- Section 2.1: "based on linear relaxation"; linear relaxation of what is not clear.
- Section 2.2: "to a globally optima"; "optima" is plural.
- Problem 1: (x, y) ~ \mathcal{X} is not clarified; as \mathcal{X} is a training set, please explain how an expectation is being taken over the training set (page 3).
- "Inner maximization is achieved" is not very clear (page 3).
- Page 3: "You et al. (2021) showed that adversarial training provably learns robust halfspaces in the presence of noise"; the context of robust halfspaces is not clarified, and it does not have meaning alone.
- Section 2.3, last paragraph: this follows simply from the definition of robust optimisation; we typically minimise theta for the worst-case solution, which depends on theta, so the inner problem cannot be solved numerically; rather, its tractable counterpart is derived analytically. This is well known already.
- Page 4: u is not defined when l(u) is being introduced.
- Page 4 onward: sometimes u_i is used and sometimes its fully indexed form; some sentences claim a result for delta_i but do not define i or state "for all i = 1, ..., n".
- Assumption 2: j is not defined.
- The definition of \overline{\ell} in Equation 6 is not in line with the standard definitions; maybe in the summation you can already include the domains of the maximisation problems, e.g. the maximum over delta_i subject to the infinity norm of delta_i being at most epsilon.
- The usage of "we only care about" is perhaps a bit informal (page 5).
- "To denote the value at time t": the value of what is not clarified (page 5).
- Page 5: two similarly written quantities "a_ri" are used before being defined; also, the definitions are in general not clear.
- When introducing the least eigenvalue ("let ... be the least eigenvalue"), prefer "smallest/minimum eigenvalue", and use the corresponding symbol consistently.
- Theorem 1 and throughout the paper: x_i is never defined.
- Referring to Section 2 in the statement of Theorem 1 is perhaps a little informal.
- Theorem 1 is overall very hard to read; could the authors please split it into multiple sentences and present it more clearly?
- Section 4.3, in the beginning: a bound of the form norm of w_r(t) minus w_r(0) being at most R is used without defining R or t.
- Overall, Theorem 1 and Lemma 2 state "at least 1 - delta" without defining delta.
- Lemma 3 mentions that the result holds when Assumption 2 is satisfied, but Assumption 1 is not mentioned in the other results relying on it.
- Page 8: "then we want to make ..." is unclear.
- Lemma 7: the sentences are not complete.
- Page 8: "plug in Eq. 30"; is this a typo?
- Section 4, last paragraph: when epsilon = 0 the result should be well known; could the authors please cite relevant papers and the convergence results mentioned there, to show whether the results found in this paper match them?
- The numerical experiments are not very clear: the MNIST dataset is used, but the paper relies on binary classification problems. Overall, I think the numerical experiments could be explained more clearly.
- Conclusion: the "which converges to 100" terminology is not clear to me.
- The references are not consistent, e.g., "in ICLR, 2015" versus "in International Conference on Learning Representations, 2020a".
- Appendix: "at most 1 minus the probability" is informal.

I can share a further list of minor editorial suggestions should the chairs decide to accept this work.

The paper studies a very relevant problem, and there are no mistakes in the proofs as far as I can see. However, the main concerns are: (i) the paper is hard to read and understand; (ii) the convergence result relies on many assumptions that cannot easily be extended to more general settings; and (iii) the convergence is probabilistic, and if the desired probability gets finer then the required radius of the infinity-norm ball gets smaller, hence one needs to also increase the number of parameters of the network. So the results do not work for a fixed architecture or a fixed error radius.
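Several of the points above (the definition of robust optimisation, the domain of the inner maximisation, and the role of the IBP loss from Eq. 6) refer to the following standard formulation. The block below restates it in generic notation for context; it is not copied from the submission, and the exact symbols in the paper's Eq. 6 may differ.

```latex
% For every parameter vector \theta and every training pair (x_i, y_i),
% the IBP loss upper-bounds the inner maximisation of robust training:
\max_{\|\delta_i\|_\infty \le \epsilon}
  \ell\bigl(f_\theta(x_i + \delta_i),\, y_i\bigr)
\;\le\;
\overline{\ell}_\theta(x_i, y_i; \epsilon),
\qquad
% so IBP training minimises the tractable upper bound on the robust risk:
\min_\theta \; \frac{1}{n} \sum_{i=1}^{n} \overline{\ell}_\theta(x_i, y_i; \epsilon).
```

Convergence of IBP training, the subject of the submission, is then a statement about gradient descent on the right-hand objective; the remark about including the constraint on delta_i inside the summation concerns exactly how this inner domain is written in Eq. 6.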
---

The paper presents, to the best of my knowledge, the first convergence analysis of IBP training, a method commonly employed to train networks that are certifiably robust to adversarial examples. The authors build on previous work on proving the convergence of gradient descent for natural training of overparametrized networks, in particular relying on assumptions and derivations similar to Du et al. (2018b). The authors extend the proofs to the case of the upper bound on the robust loss given by IBP. It is proved that IBP training converges to zero certified robust loss with high probability.

The choice of IBP is relevant, as it forms the basis of most state-of-the-art certified training algorithms. While the assumptions seem to be quite restrictive (a two-layer ReLU network of constrained width and an upper bound on the perturbation radius), the community could start from this work to provide extensions similar to those provided for natural training by Allen-Zhu et al. (2019). Furthermore, as a corollary of the convergence proof, it is implied that the IBP certified robust accuracy will converge to the true robust accuracy; this complies with the folklore observation in the community that models trained with a given algorithm are more easily verified with the same certification method. Moreover, I found the observations in the experimental section to be of interest to the community, in particular point (b), which provides a possible explanation for the commonly employed epsilon warm-up schedules.

Minor comments:
- In Theorem 1 it is said that the IBP certified robust accuracy can converge to zero; do the authors mean the IBP certified robust loss?
- Epsilon = 0.03 is actually a fairly small value, as opposed to 0.1 or 0.3, which are more commonly employed on MNIST.
- In the conclusions: "our results has a condition" should be "our results have a condition".

The authors non-trivially extend theoretical analyses previously presented in the context of natural training of overparametrized networks to the context of IBP training. While the assumptions that lead to the results are fairly restrictive, I believe the results are of great interest to the community and could lay the ground for further work in the area.

---

This work provides a theoretical analysis of the convergence of IBP training on overparametrized networks. The main theorem states that the IBP certified robust training error can converge to zero with high probability.

This paper explores the training dynamics of IBP training and provides a convergence analysis. This is a novel direction for certified robust training and, as such, valuable for the research community. Specifically, the authors restrict themselves to a presumably simpler setting than the general one: two-layer ReLU networks where only the weights of the first layer are changed during training, while the weights of the second layer remain unchanged and are either plus or minus one. The authors then proceed to establish various relationships between the time-dependent network weights and the eigenvalues of a matrix governing the training dynamics. Finally, they establish that a growing network width will, with high probability, lead the IBP certified robust error to converge to zero. The high-level ideas seem to check out; the proofs were sporadically checked.

The background section (Section 3) provides a good introduction to the techniques the paper builds upon. However, Section 4 needs to be improved: between the many lemmas, the central theme is sometimes not clear. Section 4 would benefit from adding more explanations and more intuition; to give an example, the step from Eq. 30 to Eq. 31 seems not obvious. Further, if I understood the paper correctly, it should be clarified in the conclusion that the certified robust accuracy converges to 100% on the training set.

Further questions for the authors:
- Please speculate: do you expect a similar result for arbitrary depth instead of arbitrary width?
- Does Assumption 1 hold for the MNIST dataset? If not, can this be fixed by offsetting all pixel values by a small positive constant?
- Why is Assumption 2 needed, intuitively? Could this assumption potentially be relaxed, or do you expect that the theorem would then not hold anymore?
- What are the technical similarities and differences to Du et al. (2018b)?

Minor:
- Eq. 6: it could be clarified notationally that the barred loss is obtained using IBP.
- Eq. 11: one symbol too many.
- Eq. 12: clarify that the prime in the loss denotes the derivative.
- Theorem 1: do I suppose correctly that the authors mean here that the IBP certified robust error converges on the training set to zero with high probability, instead of the IBP certified robust accuracy?

This paper explores the certified robust training dynamics and proves convergence with high probability on a training set under certain assumptions. While the direction is novel, the writing and presentation should be improved.

### Summary:
Verifying the robustness of neural networks is an important application in machine learning. The submission takes on this challenge via the interval bound propagation (IBP) framework and provides a theoretical analysis of the training procedure. They establish, in the large-network-width case, that the certification via IBP reflects the robustness of the neural network. Despite the tensions between the changing architecture and the required accuracy, the results are insightful. The AC recommends that the authors revise the paper, correcting the significant number of typos, and improve the presentation for the final version.
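The reviews above refer repeatedly to the IBP bound without spelling out how interval bound propagation works. As a reference point, the sketch below shows the standard IBP computation for a two-layer ReLU network under an l-infinity perturbation of radius epsilon: an input box is pushed through the affine and ReLU layers and a worst-case margin is read off. This is a generic illustration of the technique, not the submission's exact formulation; the network shapes and the margin-based certificate are assumptions made for the example.

```python
import numpy as np

def ibp_two_layer(W1, b1, W2, b2, x, eps):
    """Interval bound propagation through f(x) = W2 @ relu(W1 @ x + b1) + b2
    for all inputs within the l_inf ball of radius eps around x."""
    lower, upper = x - eps, x + eps               # input box

    # First affine layer, in center/radius form.
    center, radius = (upper + lower) / 2.0, (upper - lower) / 2.0
    z_center = W1 @ center + b1
    z_radius = np.abs(W1) @ radius
    z_lower, z_upper = z_center - z_radius, z_center + z_radius

    # ReLU is monotone, so it maps interval endpoints to interval endpoints.
    h_lower, h_upper = np.maximum(z_lower, 0.0), np.maximum(z_upper, 0.0)

    # Second affine layer.
    h_center, h_radius = (h_upper + h_lower) / 2.0, (h_upper - h_lower) / 2.0
    out_center = W2 @ h_center + b2
    out_radius = np.abs(W2) @ h_radius
    return out_center - out_radius, out_center + out_radius

def certified_by_margin(out_lower, out_upper, y):
    """Certified robust if the lower bound of the true logit exceeds the
    upper bound of every other logit."""
    return out_lower[y] > np.max(np.delete(out_upper, y))

# Toy usage with assumed shapes (10 inputs, 32 hidden units, 2 classes).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(32, 10)), np.zeros(32)
W2, b2 = rng.normal(size=(2, 32)), np.zeros(2)
x, y, eps = rng.normal(size=10), 1, 0.03
lo, hi = ibp_two_layer(W1, b1, W2, b2, x, eps)
print("certified:", certified_by_margin(lo, hi, y))
```

IBP training then minimises a loss (e.g., cross-entropy) evaluated on these worst-case logits rather than on the clean logits, which is the objective whose convergence the submission analyses.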
input_ids: [numeric token-ID array for the review example above (shown only in part); values omitted]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 1077, 5667, 285, 4623, 1895, 253, 15965, 5018, 1007, 3451, 281, 479, 50275, 2577, 2201, 7350, 50276, 18, 50276, 49836, 4583, 891, 2868, 253, 2929, 1057, 417, 9569, 253, 6239, 16575, 3797, 2709, 963, 993, 285, 556, 16503, 275, 253, 28939, 891, 588, 1618, 690, 273, 1110, 387, 253, 990, 273, 619, 2278, 533, 891, 1158, 275, 2087, 253, 2929, 943, 15080, 247, 2201, 2278, 3448, 28939, 21977, 19274, 9759, 273, 253, 5368, 6239, 9759, 273, 253, 2022, 10012, 285, 253, 27947, 50276, 19, 50276, 8719, 1319, 285, 13260, 7118, 2145, 403, 327, 1929, 789, 285, 2593, 608, 310, 417, 1077, 4619, 1580, 436, 2929, 19539, 247, 14940, 906, 28055, 50276, 48521, 436, 9380, 7680, 476, 320, 10405, 1701, 407, 10012, 337, 285, 253, 458, 44661, 4283, 281, 352, 436, 906, 310, 323, 891, 689, 3575, 11656, 1701, 6928, 21255, 3817, 26309, 342, 1355, 2217, 9941, 50276, 12211, 337, 9162, 3237, 21983, 774, 86, 6928, 362, 2014, 285, 4618, 8763, 3828, 1014, 275, 253, 3361, 273, 841, 13260, 253, 1543, 403, 342, 1029, 5912, 3738, 1907, 690, 13260, 310, 5667, 24088, 41113, 253, 20452, 9941, 954, 1543, 956, 432, 253, 2173, 873, 273, 13260, 326, 789, 323, 436, 1083, 533, 2550, 320, 4354, 6508, 24088, 253, 5971, 1146, 337, 310, 9073, 8127, 281, 1007, 387, 253, 861, 273, 253, 4872, 12091, 281, 30215, 3966, 25761, 253, 5170, 3033, 327, 253, 2228, 9941, 27221, 604, 359, 2430, 247, 10046, 906, 417, 253, 643, 1039, 1475, 25761, 4390, 604, 359, 4993, 271, 10336, 840, 359, 943, 6379, 253, 2228, 9941, 1919, 253, 906, 2987, 50276, 37585, 2792, 50276, 555, 993, 689, 19484, 728, 12002, 689, 19484, 1025, 3239, 374, 2781, 78, 1320, 3239, 495, 819, 10641, 17, 85, 1269, 74, 50276, 6377, 818, 50276, 31158, 476, 29623, 281, 247, 1127, 310, 1146, 908, 7208, 533, 417, 2590, 752, 352, 310, 50276, 783, 26309, 403, 21842, 533, 436, 310, 417, 5393, 627, 403, 643, 973, 4304, 26309, 1690, 43904, 29438, 390, 5164, 21257, 50275, 423, 273, 2593, 337, 762, 2176, 2515, 50276, 2520, 310, 417, 2590, 5734, 3095, 9563, 253, 2644, 2929, 812, 253, 4477, 4496, 3748, 253, 2515, 13366, 50276, 4674, 3127, 1754, 327, 4872, 17040, 50276, 8172, 17040, 273, 752, 310, 417, 2590, 50276, 4674, 3307, 281, 247, 21349, 5556, 66, 50276, 2178, 8032, 310, 25540, 50276, 28872, 18, 1269, 90, 948, 14168, 1179, 89, 310, 417, 31637, 347, 14168, 1179, 89, 310, 247, 3733, 873, 4496, 5513, 849, 271, 15355, 310, 1146, 2668, 689, 253, 3733, 873, 50276, 6377, 495, 6703, 11903, 1320, 310, 6786, 310, 417, 1077, 2590, 50276, 6377, 495, 368, 1162, 355, 43425, 2692, 326, 48960, 3733, 872, 1598, 3037, 10237, 2716, 31748, 275, 253, 3361, 273, 6046, 50276, 783, 3634, 273, 10237, 2716, 8470, 310, 417, 31637, 352, 1057, 417, 452, 4495, 3815, 50276, 4674, 3495, 1390, 12494, 3637, 3365, 432, 253, 5426, 273, 10237, 5556, 5837, 359, 5431, 7221, 885, 39116, 323, 253, 9065, 5045, 2900, 534, 7024, 327, 39116, 594, 253, 6703, 1895, 2550, 320, 14042, 27184, 2581, 697, 10649, 494, 14317, 310, 6012, 41398, 436, 310, 973, 4304, 2168, 50276, 6377, 577, 1484, 310, 417, 2931, 672, 26535, 310, 1146, 5611, 50276, 6377, 577, 47768, 4536, 28243, 310, 1146, 908, 4536, 1484, 27684, 991, 74, 50276, 8826, 14683, 1750, 247, 906, 323, 18687, 74, 533, 513, 417, 4853, 891, 390, 1375, 323, 512, 891, 337, 79, 50276, 515, 23892, 374, 480, 310, 417, 2931, 50276, 28692, 273, 689, 3642, 293, 275, 5150, 721, 310, 417, 275, 1386, 342, 253, 2629, 14308, 5046, 275, 253, 36138, 368, 476, 2168, 2486, 253, 10625, 273, 11903, 5837, 3237, 347, 2781, 18687, 891, 50276, 2808, 50276, 3005, 74, 3259, 3040, 299, 4277, 50274, 24483, 273, 359, 760, 1557, 
670, 310, 4931, 247, 2372, 25040, 50276, 6377, 608, 281, 9173, 253, 1318, 387, 673, 246, 50276, 2877, 273, 752, 310, 417, 31637, 50276, 6377, 608, 247, 363, 285, 247, 363, 403, 1146, 908, 1078, 1146, 2931, 671, 253, 14308, 403, 275, 2087, 417, 2590, 970, 50276, 34235, 273, 50276, 263, 29570, 1339, 50276, 1257, 50275, 38462, 25023, 50276, 6795, 383, 35674, 25023, 50276, 33921, 337, 285, 4768, 253, 2929, 1269, 74, 310, 1620, 2931, 50276, 709, 24247, 281, 2593, 374, 275, 253, 3908, 273, 10012, 337, 310, 4931, 247, 1652, 2372, 25040, 50276, 33921, 337, 310, 4583, 1077, 1892, 281, 1239, 812, 253, 4477, 4496, 8085, 352, 281, 2709, 14683, 285, 1246, 352, 625, 4518, 50276, 4674, 7652, 275, 253, 5068, 8772, 50276, 10641, 2640, 458, 82, 391, 310, 1146, 908, 1293, 13947, 391, 390, 246, 50276, 1189, 455, 10012, 337, 285, 18057, 374, 1375, 387, 1878, 337, 3005, 1293, 13947, 18687, 50276, 21838, 495, 25957, 253, 906, 6556, 672, 9376, 374, 310, 10048, 533, 9376, 337, 310, 417, 5393, 275, 253, 643, 1543, 22128, 327, 326, 50276, 6377, 854, 840, 359, 971, 281, 1056, 310, 12744, 50276, 21838, 818, 253, 14683, 403, 417, 3426, 50276, 6377, 854, 10358, 275, 16186, 1884, 310, 436, 247, 1745, 80, 50276, 4674, 577, 1390, 12494, 50276, 9453, 299, 4277, 50276, 17, 253, 906, 943, 320, 973, 4304, 812, 253, 4477, 4496, 26542, 4623, 9380, 285, 253, 14940, 5393, 627, 281, 921, 1880, 253, 1543, 1119, 275, 436, 2929, 3761, 841, 1543, 50276, 40907, 474, 4679, 403, 417, 1077, 2590, 253, 278, 79, 382, 10895, 310, 1146, 908, 533, 253, 2929, 15771, 327, 8985, 9162, 3237, 4583, 891, 1158, 253, 10704, 4679, 812, 320, 5544, 625, 4518, 50276, 585, 3444, 534, 26414, 281, 2233, 28939, 310, 417, 2590, 281, 479, 50276, 783, 10414, 403, 417, 5185, 24088, 275, 17857, 32888, 4104, 7147, 275, 5213, 8059, 327, 4715, 14237, 9169, 66, 50275, 50237, 387, 954, 337, 19734, 253, 5912, 50276, 2520, 310, 25040, 50276, 74, 476, 3894, 247, 2007, 1618, 273, 5884, 21977, 13991, 943, 253, 21583, 7617, 281, 2997, 436, 789, 50276, 783, 2929, 2175, 247, 1077, 4623, 1895, 627, 403, 642, 16503, 275, 253, 27947, 347, 2080, 347, 891, 476, 923, 2299, 253, 2022, 7350, 403, 891, 253, 2929, 310, 1892, 281, 1239, 285, 2096, 21255, 253, 14940, 906, 15771, 327, 1142, 13260, 326, 2550, 4354, 320, 6508, 281, 625, 2087, 7533, 37685, 253, 14940, 310, 37851, 285, 604, 253, 6799, 5912, 4850, 40259, 840, 253, 2424, 9941, 273, 253, 23579, 2910, 588, 755, 4577, 7613, 581, 3198, 281, 671, 2572, 253, 1180, 273, 3602, 273, 253, 2990, 594, 253, 1543, 513, 417, 789, 323, 247, 4229, 10336, 390, 247, 4229, 2228, 9941, 5474, 339, 431, 248, 2929, 10262, 281, 253, 1682, 273, 619, 3640, 253, 806, 14940, 1783, 273, 18890, 81, 3733, 247, 1332, 7744, 7091, 281, 6194, 6928, 326, 403, 5306, 18279, 1598, 10237, 281, 48960, 6667, 253, 4477, 1973, 327, 2045, 789, 327, 18597, 253, 14940, 273, 11786, 18499, 323, 3626, 3733, 273, 689, 3575, 292, 50065, 6928, 275, 1798, 22128, 327, 2074, 13260, 285, 28529, 281, 3443, 1162, 355, 4765, 67, 253, 4477, 9017, 253, 27947, 281, 253, 1083, 273, 253, 5170, 3033, 281, 253, 10237, 2957, 347, 1677, 407, 18890, 81, 352, 310, 8058, 326, 18890, 81, 3733, 26414, 281, 5058, 18065, 10237, 2957, 342, 1029, 1742, 357, 3093, 90, 253, 4327, 273, 18890, 81, 310, 4623, 347, 352, 4948, 253, 3720, 273, 954, 1375, 23037, 14387, 18065, 3733, 11333, 50276, 6050, 253, 13260, 1646, 281, 320, 3240, 29190, 2500, 311, 4071, 774, 86, 2990, 273, 20793, 4871, 285, 253, 5170, 3033, 327, 253, 20452, 9941, 253, 3114, 812, 1265, 432, 436, 789, 281, 2085, 2074, 18149, 347, 1110, 2530, 323, 
3626, 3733, 407, 512, 12586, 11917, 1162, 355, 6247, 33810, 347, 40460, 273, 253, 14940, 4737, 352, 310, 10466, 326, 253, 18890, 81, 18065, 10237, 7200, 588, 29623, 281, 253, 2032, 10237, 7200, 436, 3137, 447, 342, 253, 19365, 410, 8310, 275, 253, 3114, 326, 3082, 10166, 342, 247, 1677, 5933, 403, 625, 4354, 12654, 342, 253, 1072, 21612, 1332, 25761, 891, 1119, 253, 7313, 275, 253, 5661, 2593, 281, 320, 273, 1600, 281, 253, 3114, 275, 1798, 1127, 270, 5277, 247, 1896, 8813, 323, 253, 7744, 7091, 299, 4277, 5890, 484, 28631, 50276, 37585, 5701, 50276, 249, 10012, 337, 352, 310, 753, 253, 18890, 81, 18065, 10237, 7200, 50276, 5092, 29623, 281, 5058, 513, 253, 4477, 1599, 253, 18890, 81, 18065, 10237, 2957, 50276, 4259, 4838, 310, 2686, 247, 9648, 1355, 1318, 347, 10066, 281, 299, 4277, 520, 390, 299, 4277, 2941, 534, 403, 625, 7744, 7091, 275, 278, 79, 382, 50276, 249, 253, 11815, 776, 1543, 556, 247, 1617, 50276, 454, 1543, 452, 247, 1617, 50276, 783, 4477, 25450, 1069, 1365, 9017, 10527, 6260, 3786, 3559, 275, 253, 3634, 273, 3626, 3733, 273, 689, 3575, 292, 50065, 6928, 281, 253, 3634, 273, 18890, 81, 3733, 1223, 253, 13260, 326, 1421, 281, 253, 1543, 403, 9648, 29190, 891, 2868, 253, 1543, 403, 273, 1270, 1600, 281, 253, 3114, 285, 812, 2242, 253, 3216, 323, 2007, 789, 275, 253, 2170, 5474, 33032, 2520, 789, 3400, 247, 10527, 1783, 327, 253, 14940, 273, 18890, 81, 3733, 327, 689, 3575, 292, 50065, 6928, 253, 2022, 10012, 3054, 326, 253, 18890, 81, 18065, 10237, 3733, 2228, 476, 29623, 281, 5058, 342, 1029, 5912, 50276, 2520, 2929, 33826, 253, 3733, 8062, 273, 18890, 81, 3733, 285, 3400, 247, 14940, 1783, 436, 310, 247, 4460, 3884, 323, 18065, 10237, 3733, 285, 347, 824, 9865, 323, 253, 2561, 3114, 5742, 253, 4477, 4656, 3746, 281, 247, 18289, 19554, 840, 253, 2087, 4758, 273, 374, 3828, 774, 86, 6928, 835, 760, 253, 13461, 273, 253, 806, 3828, 588, 755, 4391, 1309, 3733, 253, 13461, 273, 253, 1273, 3828, 3464, 19965, 285, 403, 2057, 337, 390, 337, 50276, 783, 4477, 840, 4262, 281, 5100, 2710, 7688, 342, 875, 253, 673, 7976, 2990, 13461, 285, 20223, 273, 247, 4315, 13200, 253, 3733, 8062, 4720, 597, 5100, 326, 247, 5675, 2990, 4871, 588, 342, 1029, 5912, 1421, 281, 247, 14940, 273, 253, 18890, 81, 18065, 10237, 2228, 310, 5058, 253, 1029, 1268, 5697, 1646, 281, 2451, 562, 253, 27947, 835, 24188, 324, 1037, 10141, 50275, 783, 4114, 2593, 2593, 495, 3400, 247, 1175, 10199, 281, 253, 5609, 253, 2929, 21168, 2220, 2299, 2593, 577, 3198, 281, 320, 5520, 875, 253, 1142, 458, 44661, 253, 4275, 10014, 310, 4536, 417, 2590, 2593, 577, 651, 5649, 432, 6240, 625, 22909, 285, 625, 30328, 281, 1918, 271, 1650, 253, 3213, 432, 16186, 1884, 281, 16186, 4562, 3133, 417, 4755, 2007, 604, 891, 7192, 253, 2929, 9113, 352, 943, 320, 31637, 275, 253, 6452, 326, 253, 18065, 10237, 7200, 26414, 281, 2233, 327, 253, 3733, 873, 50275, 44295, 3533, 323, 253, 4477, 50276, 32897, 30821, 513, 368, 1902, 247, 2074, 906, 323, 10341, 6864, 3185, 273, 10341, 4871, 50276, 18566, 9376, 337, 2186, 323, 253, 278, 79, 382, 10895, 604, 417, 476, 436, 320, 4229, 407, 8409, 1076, 512, 12275, 2193, 407, 247, 1355, 2762, 3638, 50276, 22309, 310, 9376, 374, 3058, 540, 41597, 812, 436, 9376, 7826, 320, 19595, 390, 513, 368, 1902, 326, 840, 253, 10012, 651, 417, 2186, 10542, 50276, 5371, 403, 253, 253, 7681, 22620, 285, 3910, 281, 3443, 1162, 355, 4765, 67, 50275, 37585, 50276, 2574, 721, 1060, 352, 812, 320, 31637, 14951, 595, 326, 2534, 77, 310, 2797, 970, 18890, 81, 50276, 2574, 1903, 581, 50276, 936, 1199, 50276, 2574, 1249, 
19148, 326, 253, 50276, 249, 298, 12853, 253, 4309, 50275, 33921, 337, 513, 891, 9428, 9113, 253, 4477, 1599, 1060, 326, 253, 18890, 81, 18065, 10237, 2228, 26414, 327, 253, 3733, 873, 281, 5058, 342, 1029, 5912, 3185, 273, 253, 18890, 81, 18065, 10237, 7200, 436, 2929, 33826, 253, 18065, 10237, 3733, 8062, 285, 19539, 14940, 342, 1029, 5912, 327, 247, 3733, 873, 762, 2176, 13260, 1223, 253, 3884, 310, 4460, 253, 4028, 285, 9759, 943, 320, 5520, 50275, 187, 187, 4118, 18435, 27, 332, 5411, 31640, 273, 11454, 6928, 310, 271, 1774, 2898, 275, 5145, 4715, 253, 19529, 3936, 327, 436, 5691, 3066, 253, 7726, 3033, 18634, 18890, 81, 7792, 285, 3400, 247, 10527, 1783, 327, 253, 3733, 5199, 597, 5100, 275, 253, 1781, 2990, 342, 1083, 326, 253, 21612, 3066, 18890, 81, 13806, 253, 31640, 273, 253, 11454, 2990, 5747, 253, 29005, 875, 253, 6890, 10336, 285, 253, 2424, 7200, 253, 1543, 403, 47860, 253, 913, 32636, 253, 4477, 281, 49620, 253, 2929, 35827, 253, 1534, 8322, 273, 963, 993, 285, 3157, 253, 9759, 323, 697, 2457, 2715 ]
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: edit: after reading the other reviews the authors responses and thinking more about the concerns raised i have increased my score however i still recommend rejection because of questions around the hyperparameters used in the experiments

summary the paper introduces a regularized mean squared projected bellman error objective function where the regularizer penalizes large changes to the estimated value function this regularized objective function is used to derive a gtd2like algorithm where updates to the value function weights are penalized the paper claims an improved rate of convergence and empirically investigates the proposed algorithm on tabular random walks the boyan chain environment and bairds counterexample

pros paper proposes interesting new method paper includes theoretical argument for proposed method paper empirically investigates proposed method

cons concerns about soundness of method concerns about originality clarity and quality

decision at the present time i recommend rejecting the paper until the following concerns can be addressed

soundness does the proposed modification to the mspbe change the underlying problem being solved is the solution to the regularized mspbe the same as the solution to the original mspbe even with function approximation the fact that gradientdd4 did not converge on the boyan chain is very concerning the motivation for gtd2 is to converge when used offpolicy with function approximation if the proposed modifications lose the convergence guarantee then why not just use conventional td offpolicy

originality there are no references to prior work on convergence rates of gtd2 in section 4 the analysis seems like it was based on an existing analysis but nothing is cited there is no explicit related work section which would help clarify the novelty of contributions and would help position the paper within the existing literature

clarity section 4 improved convergence rate is poorly explained and very difficult to follow section 5 doesnt mention beta, the step size for the auxiliary weights earlier in the paper kappa is referred to as a regularization parameter but in section 5 its called a step size parameter and annealed there are several statements that dont make sense to me "the regularization term uses the previous value function estimate to avoid large biases in the updating process" the use of the word biases here is confusing and conflicts with the statistical notion of bias updates to weights would generally not be considered biases in the statistical sense however the regularization term can be thought of as biasing the optimization towards solutions with certain qualities "importance sampling is useful for decreasing the variance of parameter updates" using importance sampling to correct the difference between the target and behaviour policies usually increases the variance of parameter updates importance sampling shrinks updates that occur more often than they would when following the target policy and enlarges updates that occur less often than they would when following the target policy the average distance from the mean update can be larger than without importance sampling "in effect the regularization term encourages reexperience around the estimate at previous time step especially when the state space is large" what does reexperience mean "accelerate the gtd2 algorithm" the word accelerate is used several times in the paper to describe the gradientdd
update but the idea of penalizing large updates to the value function weights conflicts with the conventional meaning of acceleration in optimization using past information to make larger changes to weights as is done with nesterov acceleration momentum adam etc which is confusing penalizing updates to the value function weights would actually slow the changing of the value function weights not accelerate it this might allow the second set of weights to learn better estimates of the expected td error because the expected td error is changing as the value function weights change which could account for the performance increase over gtd2 quality best performance in the final episode is not an appropriate way to determine the bestperforming parameter settings when the paper makes claims about the speed of learning of various methods the parameter settings that result in the lowest error at the end of training will not in general be the parameter settings that result in the fastest learning ie smallest area under the curve if the paper is going to make claims about learning speed then the parameter settings should be selected based on the smallest area under the curve this might be why tdc performs so poorly in these experiments when it outperforms gtd2 in other papers see ghiassian et al 2018 tdc is called gtd in that paper and intuitively should perform similarly to conventional td in early learning when the correction weights are near 0 this seems like a serious issue to me the experiments may need to be rerun with different parameter settings that better match the claims the paper is making about learning speed suggestions for improvement in addition to addressing the concerns mentioned above consider adding a related work section that explicitly compares and contrasts the most relevant related methods consider motivating gradientdd more along the lines of trpo reps and other algorithms that penalize large changes to the weights being learned instead of motivating it as accelerating gtd2 actually it would be better to do some simple experiments to test why the regularization improves performance over gtd2 does it result in the second set of weights learning the expected td error with greater accuracy can the same effect be achieved by a two timescale approach where the value function weights are updated with a smaller step size than the second set of weights if not it would provide more support for the proposed method despite the concerns listed in this review i actually think this paper has a very interesting premise and deserves further study and investigation misc details a sentence trails off in the first paragraph of the introduction where this term originates from the squared bias term in the objective 6 equation 6 seems to be the gtd2 update rules not the objective function references ghiassian s patterson a white m sutton r s white a 2018 online offpolicy prediction arxiv preprint arxiv181102597 docsep summary of contributions the paper proposes the gradient descent td difference learning gdd algorithm which adds a term to the mspbe objective to constrain how quickly a value function can change they argue that their approach has a quicker convergence rate and empirically demonstrate in several examples with linear function approximation that it substantially improves over existing gradientbased td methods review i like the simplicity of the proposed method and its intuitive interpretation as a valuebased trust region however i have the following questions and concerns 1 there 
doesnt seem to be any information regarding how many independent runs were performed in the empirical evaluation and there was no no reported statistical significance testing can the authors clarify this information and comment on the significance of the results 2 while it led to improvements over gtd2 it largely didnt improve over regular semigradient td apart from bairds counterexample which was designed to make td fail as such i dont think the addition of a new parameter was convincingly justified some of the results seemed to suggest that the improvement grew as the state spacecomplexity increased that it may be the case that the evaluation falls a bit short on exploring more complex environments while the breadth of the ablation studies is really nice we observe similar trends in many neighbouring figures that the space in the main text from showcasing the many different configurations could be summarized with representative examples and the additional space could have been used to provide some additional experimentsinsights like those suggested in the discussion 3 from how modular the addition of the term is to the objective have the authors tried incorporating the regularization to semigradient td is there anything about the semigradient update that bars its use td generally performed really well in the papers evaluation outside of bairds counterexample that it would make a stronger case if the extension was demonstrated to be more generally applicable and that it consistently improved over the methods it was applied to this sort of ties into what was described in 2 where what was presented seems to fall a bit short and how the space could have showcased a bit more 4 while the papers focus was on the case of linear function approximation can the authors comment on how readily the approach can be extended to the nonlinear case gtd methods have not seen as much adoption as their approximate dynamic programming counterparts when combining td methods with nonlinear function approximation that it can raise questions as to how the methods scale to more complicated settings given the above i am erring toward rejection at this time i think 1 is a rather significant issue that needs to be addressed and im willing to raise my score if that and my other concerns can be sufficiently addressed post discussion taking the other reviews and the authors response into account i still maintain my score while i agree that its good to be thorough in something clear and simple it can still be done to a point of redundancy and consequently seem less thorough in the overall picture and claims made im still largely unsure on the choice to only apply the supposedly modular extension to gtd2 and not try it with td which seemed like a clearer winner apart from bairds counterexample as others suggested there are additional methods which might be good to compare to and other evaluation metrics might make more sense for the claims being made many of my concerns were largely brushed off as future work that little got addressed without having to carry out the experiments high level commentscurrent thoughts could be provided regarding how readily the approach can extend to the scenarios suggested or if there are nuances that need to be worked out etcdocsepthis paper proposes a variant of the gtd2 algorithm by adding an additional regularization term to the objective function and the new algorithm is named as gradientdd gdd the regularization ensures that the value function does not change drastically between 
consecutive iterations the authors show that the update rule of gdd can be written as a difference equation and aim to further show the convergence via lyapunov based analysis an simulation study is provided to compare the proposed gdd algorithm with td etd and gtd the paper is well written in general the idea of extra regularization on the distance between two value functions sounds reasonable to me since it resembles the constraint in trust region optimization for policy gradient methods however the claimed improved convergence over gtd is not rigorously proved and thus not convincing in section 4 the convergence analysis is not derived in a rigorous way it would help the readers to understand the improved convergence if the authors could complete the analysis and show the convergence rate why can the eigenvalues of matrix jn be written as the block matrix before eq 14 it seems to me that g and h are diagonal matrices with the diagonal elements being the eigenvalues of gn and hn ideally the eigenvalues of jn which is denoted as j in this paper should also be a diagonal matrix furthermore since gn is not symmetric g may have some complex values as its eigenvalues this is ignored from the current analysis without any explanation in the experiment part figure 3 shows that the rms error of the gdd algorithm will blow up when step size is large it seems that the proposed algorithm may not be as robust as the conventional td algorithm edits after the rebuttal thank you for the responses after reading them and the discussion with other reviewers i still think the current contribution of this paper is marginal and i keep my score as 5 docsep summary of paper this paper introduces a novel regularized meansquared projected bellman objective and corresponding gtd2like algorithm which minimizes the objective the paper analytically investigates the convergence rate of the proposed algorithm then empirically investigates the performance of the algorithm across several problems with linear function approximation summary of review this paper is a clear reject for me there appear to be significant issues in both the analytical section and the empirical section which in total bring into question the utility of the proposed algorithm the literature review also appears to be lacking and there appear to be several minor incorrect statements throughout the paper i feel quite confident in my evaluation of the empirical section and literature review i feel confident that there is a bugtypoincorrect result in the analytical section i did not attempt to debug the proof to determine which of the three bug typo or incorrect result was true details follow roughly in order of greatest concern least concern proof of convergence rate i follow the proof up to equation 12 this is a standard result from eg maeis 2011 thesis after equation 12 i follow the transformation that solved for rhon1 rhon resulting in a matrix inverse however after distributing the matrix inverse i remain confused why sqrtxi gn1 is not multiplied by the inverted matrix this however may not change the resulting eigenvalues so i do not believe it will change the result continuing on to equation 14 i did not check for correctness solving the polynomial for the eigenvalues so i will rely on the given result in the paper the conclusion of which states lambda frac12 alpha1 lambdag pm frac12 sqrttextthing lambdag ignoring a few details in the middle the result ultimately states lambda lambdag here lies the fundamental problem because gtd2 is known to be a 
convergent linear system then we know lambdag are all strictly negative proof of this in maeis thesis under some assumptions notably the invertibility of x xtop for such a linear system to be convergent we must have the real part of the eigenvalues be strictly negative the proof under equation 14 states lambda lambdag which because lambdag is strictly negative means that lambda textsome positive number this a tells us nothing of the comparative convergence rates since lambda could be larger than lambdag and b also suggests that there could be cases where the proposed algorithm does not even converge in the first place because lambda could be greater than or equal to 0 this could be a typo and the negative sign on the right shouldnt exist but because there are few details from equation 14 to the end i was unable to debug and decide if this is typobugincorrect conclusion there a few other issues with the proof that concern me do we know that the upperbound on kappa is reasonable the quantity alpha lambdag alpha12 4 is difficult to interpret but the reliance on alpha1 seems to imply a preference toward much smaller stepsizes which seems counterintuitive towards the goal of improving convergence rates likewise the first part of the solution for lambda we have frac12 alpha1 lambdag which likewise implies that smaller stepsizes significantly improve convergence rate small alpha implies highly negative lambda considering the remainder of the solution for lambda is under a squareroot this first factor appears to be dominant though i could be wrong on this more insight would be appreciated a stepsize approaching 0 would yield the fastest convergence rate apparently by having the smallest eigenvalue approach negative infinity this simply does not pass the smell test further building on my previous point for large enough values of alpha we will be in a situation where alpha1 lambdag meaning that the first term will become positive considering that the second term is a plusorminus then we could be adding a positive term to a positive term yielding a positive eigenvalue this means that for large enough alpha we dont have convergence any longer i wonder how feasible the upperbound on alpha is to guarantee convergence i would also like to see these assumptions explicitly stated in the proof a lot of little details were left out of the proof where are the assumptions on boundedness of the features and the rewards can one show that the noise sequence of the modified algorithm is actually a martingale difference sequence and thus the result from borkar and meyn 2000 holds need there be an assumption of independent samples or are these samples coming from markovian sampling empirical section the choice to set the initial value function vs 05 forall s for the random walk was odd i suppose that because the left reward 0 and the right reward 1 and the policy is 50 chance to go left or right and gamma1 then the optimal value function vpi linearly interpolates from 0 1 with the centermost state having value vn 2 05 this choice seems likely to disproportionately favor the proposed gdd algorithm which encourages the value function estimate to change slowly because the initial estimates are so close to correct only small changes will be necessary and the regularizer term will remain small what happens if the value function is initialized to 0 everywhere or even to 1 everywhere the exclusion of tdc from the stepsize sensitivity investigation makes little sense to me the first experiment chose an aggressively large 
stepsize alpha = 0.5 for which tdc performed poorly then did not investigate the sensitivity of tdc to stepsize in later plots because of this choice if you check ghiassian et al 2020 they report that tdc in fact outperforms gtd2 on all of the same domains tested here for appropriately chosen alpha and beta the choice of beta is never discussed how did you set beta how many runs what is the variance are any results statistically significant the primary motivation of the paper was around offpolicy learning yet only one of the tested domains was offpolicy bairds counterexample star mdp it would have been nice to see the random walks made into offpolicy domains

literature review this paper modifies the mspbe by adding a regularizer term there are a few other papers in the literature that do this and derive the corresponding gtd2tdc algorithms liu et al 2012 and ghiassian et al 2020 immediately come to mind these should both be cited and discussed are there any papers that add such a constraint as ||w_n - w_{n-1}||^2 to any known objective function this seems like an odd choice of regularizer it penalizes making changes to the weights so any prior literature from any field supervised learning online learning optimization etc would go a long way in convincing the reader that this is a good idea the paper mentions several times that gtd methods converge more slowly than td i know of a single proof that shows this in maeis thesis for the gtd algorithm i do not know of any such proof for tdc or gtd2 there exists empirical evidence of this in ghiassian et al 2020 or white and white 2016 but neither of these papers are cited gtd methods and importance sampling are not mutually exclusive methods for offpolicy learning in fact gtd methods canonically use importance sampling for their offpolicy variants further importance sampling definitely does not decrease the variance of parameter updates mentioned in the second paragraph of section 1 sutton et al 2009 is not really a breakthrough in the study of convergence properties of mdp systems in fact the proofs of sutton et al 2009 do not even assume samples are drawn from a distribution induced by an mdp perhaps borkar and meyn 2000 is a better reference as it fundamentally builds the proof structure used by sutton et al 2009

other minutiae eyeballing the modified objective function leads me to believe the objective shares the same fixedpoint as the mspbe thus the new gtd2 algorithm converges to the same fixedpoint as td but it would be nice to show this formally in the analytical section is it possible to extend this objective to the nonlinear setting the paper mentions that the proposed regularizer avoids large biases in the updating process does it not add bias to the updating process perhaps it was meant that the regularizer avoids high variance either way a careful analytical discussion of the biasvariance properties would go a long way towards improving this paper in section 5.3 what is eta i believe this is supposed to be beta ie the stepsize for the secondary weights since eta is your secondary weight vector would it make more sense to consider kappa to be a regularizer parameter instead of a stepsize and having it absorb alpha it seems in the experiment section you split these anyways so perhaps it makes the analytical section much more clear if the algorithm was instead alpha * kappa * (x^T w_n - x^T w_{n-1}) the paper repeatedly defines offpolicy learning as learning the optimal policy using an exploratory policy this is a bit of a restrictive setting and is certainly not the setting that
sutton et al 2009 considered the work that this paper builds upon why assume that the target policy is deterministic mentioned in section 2.1 this is a strange choice that is not used in either the empirical or analytical section as far as i can tell it is mentioned that td methods should seek to minimize the mspbe or perhaps the msbe it isnt clear which is meant but shouldnt instead the goal be to minimize the msve eg ||\hat{v}_w - v_\pi||^2

papers mentioned in this review
maei, hamid reza gradient temporal-difference learning algorithms university of alberta 2011
sina ghiassian, andrew patterson, shivam garg, dhawal gupta, adam white and martha white gradient temporal-difference learning with regularized corrections international conference on machine learning 2020 https://arxiv.org/abs/2007.00611
adam white and martha white investigating practical linear temporal difference learning international conference on autonomous agents and multiagent systems 2016 https://arxiv.org/abs/1602.08771
borkar, v s and meyn, s p 2000 the ode method for convergence of stochastic approximation and reinforcement learning siam journal on control and optimization 38(2) 447-469 https://doi.org/10.1137/S0363012997331639
sutton, r s, maei, h r, precup, d, bhatnagar, s, silver, d, szepesvari, c and wiewiora, e 2009 fast gradient-descent methods for temporal-difference learning with linear function approximation proceedings of the 26th annual international conference on machine learning icml 09 18 https://doi.org/10.1145/1553374.1553501
liu, b, mahadevan, s and liu, j 2012 regularized off-policy td-learning advances in neural information processing systems 9

after discussion and edits i acknowledge that i have read the other reviews and resulting discussions and i have read the relevant changes in the edited text i have raised my score from 2 to 3 to reflect that several concerns were alleviated through the edits but several new concerns and old concerns remain i will summarize below after the author edits, the issue with convergence and convergence rates appear to have been resolved i additionally appreciate the much greater clarity in the analytical section however i still find the contribution to be borderline at best in terms of novelty of approach and i find that the evidence of applicability is still considerably lacking the introduction of a regularizer to accelerate gtd methods is itself not novel the form of the proposed regularizer is novel however i find its form to be unintuitive as it punishes making changes to the weights there are some prior works that motivate this well ie trpo and other trustregion optimization techniques but this paper does not appeal to prior works to motivate their regularizer instead i must rely on the empirical study which does not investigate the learning speed of the proposed algorithm compared to baselines in many cases the proposed algorithm does not clearly outperform baselines ### Summary:
this paper introduces a simple but potentially effective offpolicy td algorithm overall the reviewers felt the work was incomplete and not yet ready for publication they all recognized the authors made significant updates to the paper but serious issues remain with the empirical work: studying the impact of the proposed extension on other algorithms, missing baselines (eg tdrc), limited scope of environments (similar chainlike domains), significant questions about how best parameter settings were chosen for comparison, etc this is clearly an interesting direction if the authors can improve the experiments and better situate their method in the literature connecting to the lit in offpolicy rl about accelerating and improving offpolicy td methods this will become a solid contribution
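To make the update rule debated throughout these reviews concrete, here is a minimal sketch of a GTD2-style learner with the value-difference penalty the reviewers describe, assuming linear function approximation. This is an illustrative reconstruction rather than the paper's exact gradient-DD algorithm: the placement of the kappa term, the use of the previous iterate as the reference weights, and the importance-sampling weighting are all assumptions made here for clarity.

```python
import numpy as np

def gradient_dd_sketch(samples, n_features, alpha=0.05, beta=0.05, kappa=0.1, gamma=1.0):
    """Hypothetical GTD2-style update with a penalty on changing the value estimate.

    `samples` yields (x, r, x_next, rho) tuples with feature vectors x and x_next,
    reward r, and importance-sampling ratio rho. The exact form and placement of
    the penalty are assumptions, not the published algorithm.
    """
    w = np.zeros(n_features)       # value-function weights, v_hat(s) = x(s) @ w
    h = np.zeros(n_features)       # auxiliary weights estimating E[delta | s]
    w_prev = w.copy()              # weights from the previous update

    for x, r, x_next, rho in samples:
        delta = r + gamma * (x_next @ w) - (x @ w)        # TD error
        gtd2_term = (x - gamma * x_next) * (x @ h)        # usual GTD2 correction
        # gradient of the assumed penalty 0.5 * kappa * (x @ w - x @ w_prev)**2
        penalty_term = kappa * x * (x @ w - x @ w_prev)
        w_prev = w.copy()
        w = w + alpha * rho * (gtd2_term - penalty_term)
        h = h + beta * rho * (delta - x @ h) * x
    return w
```

Setting kappa to zero recovers the plain GTD2 update, which is why several reviewers ask how kappa is chosen or annealed: the penalty is what distinguishes the proposed method, and it also slows changes to the value-function weights rather than accelerating them, as the first reviewer points out.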
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 4016, 2097, 326, 29331, 50276, 1156, 8826, 2762, 1180, 436, 247, 8599, 441, 2717, 273, 253, 20407, 14940, 4142, 1580, 29331, 812, 320, 4067, 685, 24082, 21675, 285, 270, 671, 5936, 326, 627, 812, 320, 2219, 835, 253, 4081, 5933, 1057, 417, 1014, 29623, 275, 253, 806, 1659, 984, 29331, 812, 320, 3687, 685, 390, 4503, 281, 470, 436, 812, 320, 247, 1745, 80, 285, 253, 4016, 861, 327, 253, 987, 943, 2649, 2226, 533, 984, 627, 403, 1643, 4278, 432, 5150, 1638, 281, 253, 990, 891, 369, 7591, 281, 13844, 285, 7617, 604, 436, 310, 1745, 706, 814, 1763, 263, 6471, 6452, 50276, 9088, 247, 1643, 643, 3374, 342, 253, 4737, 326, 4468, 479, 50276, 3088, 359, 871, 326, 253, 5170, 9458, 327, 465, 5596, 310, 5272, 253, 10671, 9765, 24082, 21675, 50276, 1637, 805, 50276, 21, 310, 2834, 281, 4665, 533, 253, 22095, 327, 9765, 18, 3133, 281, 16084, 247, 14682, 2584, 1199, 4577, 5018, 4219, 534, 3133, 4828, 565, 48714, 4404, 253, 4736, 273, 11138, 14940, 4142, 50276, 3022, 3020, 253, 806, 629, 273, 253, 2900, 323, 29331, 359, 452, 1315, 317, 805, 9765, 18, 50276, 77, 1369, 21675, 534, 21223, 8018, 326, 4577, 5018, 4219, 3012, 3157, 14940, 2281, 1355, 9765, 8018, 4122, 4016, 29331, 7296, 253, 6414, 273, 253, 2900, 323, 29331, 310, 762, 247, 6278, 9723, 436, 806, 2803, 4620, 281, 320, 11360, 2167, 891, 812, 320, 3430, 327, 436, 625, 12288, 651, 320, 14109, 247, 5018, 907, 17682, 470, 651, 4917, 253, 22583, 14940, 2281, 8505, 407, 1907, 253, 8004, 25023, 2746, 4016, 23579, 436, 3365, 1057, 417, 1509, 253, 13624, 1071, 50276, 44295, 3652, 327, 619, 2045, 1127, 323, 1781, 2217, 2193, 273, 9765, 359, 588, 320, 275, 247, 4112, 835, 9765, 18, 50276, 77, 1369, 21675, 4495, 326, 253, 806, 1307, 588, 2489, 2762, 7296, 326, 253, 1273, 1307, 310, 247, 5043, 526, 30264, 840, 359, 812, 320, 6240, 247, 2762, 1307, 281, 247, 2762, 1307, 27012, 247, 2762, 25023, 436, 2097, 326, 323, 1781, 2217, 9765, 359, 13414, 452, 14940, 667, 3356, 891, 4282, 849, 17887, 253, 5170, 9458, 327, 9765, 310, 281, 12215, 14940, 891, 651, 671, 751, 281, 923, 841, 13260, 11120, 4767, 275, 253, 4737, 50276, 66, 2257, 273, 1652, 4278, 497, 1669, 562, 273, 253, 4737, 835, 403, 253, 13260, 327, 11542, 1255, 273, 253, 3386, 285, 253, 23267, 476, 581, 921, 326, 253, 6046, 3425, 273, 253, 7321, 5933, 310, 2686, 247, 16172, 46760, 3064, 3425, 285, 3021, 253, 906, 432, 270, 1064, 274, 285, 479, 1362, 5307, 6556, 878, 627, 320, 271, 9376, 273, 3907, 3530, 390, 403, 841, 3530, 3551, 432, 1616, 729, 757, 10491, 50275, 358, 5378, 474, 2593, 50276, 783, 4327, 281, 873, 253, 3302, 1318, 1159, 4632, 50276, 1762, 323, 455, 256, 323, 253, 3632, 2940, 369, 8909, 891, 9428, 326, 984, 253, 1669, 10921, 50276, 17, 285, 253, 987, 10921, 50276, 18, 285, 253, 3646, 310, 2456, 4839, 281, 564, 1669, 390, 987, 285, 17356, 18, 840, 253, 8654, 1318, 1159, 362, 2059, 23352, 20670, 684, 432, 470, 337, 342, 253, 1399, 32848, 1375, 1907, 1318, 362, 79, 50276, 19, 50276, 1762, 436, 4327, 3133, 2779, 281, 30839, 1523, 3718, 253, 4081, 305, 1678, 5933, 534, 29426, 253, 1318, 1159, 6642, 281, 1818, 7808, 984, 253, 3302, 8197, 403, 594, 2810, 281, 3451, 760, 1355, 2544, 588, 320, 3309, 285, 253, 3963, 6081, 1307, 588, 3464, 1355, 752, 6569, 604, 253, 1318, 1159, 310, 31260, 281, 470, 11678, 390, 1014, 281, 337, 11678, 50276, 783, 14978, 273, 246, 12352, 432, 253, 5018, 907, 7340, 5839, 2789, 1652, 3282, 281, 479, 253, 806, 3368, 9703, 271, 39730, 1781, 5018, 907, 9765, 50276, 1762, 323, 534, 246, 12352, 2684, 15225, 840, 858, 417, 7409, 253, 7340, 273, 246, 12352, 281, 5018, 907, 275, 
1996, 14777, 984, 273, 436, 4327, 604, 368, 2451, 15891, 515, 757, 1162, 355, 9169, 597, 1304, 326, 246, 12352, 275, 958, 41731, 13015, 305, 2851, 19, 327, 512, 273, 253, 1072, 10625, 5762, 1060, 323, 20420, 6777, 9765, 285, 9840, 50276, 783, 4327, 273, 9840, 310, 1620, 5469, 849, 858, 368, 873, 9840, 50276, 5430, 1142, 6613, 752, 310, 253, 11041, 403, 667, 1543, 10126, 1534, 50276, 783, 3625, 16038, 273, 253, 2929, 369, 1475, 745, 22872, 4715, 2568, 760, 581, 273, 253, 5762, 10625, 369, 745, 22872, 270, 1094, 1397, 2258, 442, 18398, 4636, 4177, 278, 12132, 352, 651, 452, 644, 5322, 281, 923, 253, 3632, 16771, 1160, 715, 745, 22872, 10625, 50275, 22478, 1177, 2278, 50276, 2520, 2929, 771, 7790, 253, 278, 1033, 1257, 407, 6240, 247, 3963, 6081, 1307, 627, 403, 247, 1643, 643, 9380, 275, 253, 6239, 326, 513, 436, 285, 15313, 253, 3969, 305, 2851, 19, 2851, 68, 11333, 632, 86, 1162, 355, 4050, 285, 305, 5801, 515, 757, 1162, 355, 9169, 4745, 1705, 281, 2564, 841, 943, 1097, 320, 11106, 285, 5469, 50276, 609, 627, 667, 9380, 326, 823, 824, 247, 7658, 347, 50276, 939, 50276, 939, 50276, 18, 374, 281, 667, 1929, 8103, 1159, 436, 3133, 751, 271, 8909, 4327, 273, 3963, 6081, 29697, 4219, 2403, 2544, 281, 253, 13461, 594, 667, 2720, 6239, 432, 667, 1673, 22296, 4715, 3909, 4715, 13757, 3966, 651, 564, 247, 1048, 1039, 275, 21414, 253, 9414, 326, 436, 310, 247, 1175, 2934, 50276, 783, 2929, 25957, 2067, 2069, 326, 305, 2851, 3082, 29623, 625, 7808, 685, 32989, 891, 871, 273, 247, 2014, 4737, 326, 2722, 436, 275, 278, 3348, 261, 22857, 323, 253, 305, 2851, 5933, 891, 513, 417, 871, 273, 667, 824, 4737, 323, 246, 12352, 390, 305, 2851, 19, 627, 4961, 16774, 1941, 273, 436, 275, 305, 5801, 515, 757, 1162, 355, 9169, 390, 3168, 285, 3168, 4022, 533, 6747, 273, 841, 9380, 403, 11106, 50276, 72, 2851, 3082, 285, 6349, 10491, 403, 417, 25834, 11855, 3082, 323, 745, 22872, 4715, 275, 958, 305, 2851, 3082, 31992, 1037, 897, 310, 323, 616, 745, 22872, 11640, 2007, 6349, 10491, 7964, 1057, 417, 6379, 253, 11041, 273, 4764, 11269, 5393, 275, 253, 1273, 12494, 273, 2593, 337, 50276, 84, 28738, 1162, 355, 4748, 310, 417, 1663, 247, 29709, 275, 253, 1263, 273, 14940, 3607, 273, 278, 12132, 2718, 275, 958, 253, 27947, 273, 256, 28738, 1162, 355, 4748, 513, 417, 1014, 5467, 3530, 403, 8392, 432, 247, 3268, 5802, 407, 271, 278, 12132, 4931, 270, 1064, 274, 285, 479, 1362, 5307, 310, 247, 1805, 3806, 347, 352, 26401, 21168, 253, 4737, 2605, 908, 407, 256, 28738, 1162, 355, 4748, 50275, 977, 1054, 307, 18620, 50276, 2653, 2275, 11822, 253, 7321, 8103, 1159, 5644, 479, 281, 2868, 253, 8103, 10764, 253, 1072, 4229, 3659, 347, 253, 278, 1033, 1257, 3021, 253, 747, 305, 2851, 19, 5933, 26414, 281, 253, 1072, 4229, 3659, 347, 32989, 533, 352, 651, 320, 5322, 281, 921, 436, 19186, 275, 253, 16101, 2593, 50276, 261, 352, 1896, 281, 9017, 436, 8103, 281, 253, 14561, 4758, 50276, 783, 2929, 25957, 326, 253, 4081, 3963, 6081, 32547, 1781, 31306, 275, 253, 22753, 1232, 1057, 352, 417, 823, 8492, 281, 253, 22753, 1232, 4931, 352, 369, 5486, 326, 253, 3963, 6081, 32547, 1029, 11041, 2057, 1039, 247, 10182, 16101, 5955, 273, 253, 8492, 87, 14417, 3607, 651, 564, 247, 1048, 1039, 4404, 11138, 436, 2929, 50276, 249, 2593, 8676, 752, 310, 1162, 66, 891, 2868, 436, 310, 6326, 281, 320, 9840, 26332, 253, 5018, 907, 323, 253, 6561, 13461, 1580, 1162, 66, 310, 634, 6561, 2801, 4972, 50276, 12756, 352, 1056, 625, 3282, 281, 1908, 465, 5596, 281, 320, 247, 3963, 6081, 4764, 3185, 273, 247, 5018, 907, 285, 1907, 352, 15816, 9765, 352, 3133, 
275, 253, 3368, 2593, 368, 8085, 841, 667, 1576, 594, 4931, 352, 2789, 253, 16101, 2593, 1199, 625, 2590, 604, 253, 5933, 369, 3185, 355, 545, 518, 5596, 209, 633, 412, 259, 79, 50276, 633, 412, 259, 79, 18, 50275, 783, 2929, 12889, 13067, 745, 22872, 4715, 347, 4715, 253, 8654, 3646, 970, 271, 41075, 3646, 436, 310, 247, 2372, 273, 247, 29190, 4758, 285, 310, 5604, 417, 253, 4758, 326, 256, 28738, 1162, 355, 4748, 2783, 253, 789, 326, 436, 2929, 21168, 2220, 50276, 22309, 5467, 326, 253, 2303, 3646, 310, 30027, 5393, 275, 2593, 3127, 436, 310, 247, 8921, 4327, 326, 310, 417, 908, 275, 2057, 253, 16774, 390, 16101, 2593, 347, 2080, 347, 891, 476, 2028, 50276, 262, 310, 5393, 326, 32989, 3082, 943, 7703, 281, 15338, 253, 278, 1033, 1257, 390, 4931, 253, 13818, 1257, 352, 310, 2649, 2590, 534, 310, 5486, 533, 943, 2649, 3185, 253, 4736, 320, 281, 15338, 253, 13818, 306, 24088, 50276, 700, 87, 88, 50276, 87, 2059, 374, 50275, 50004, 5393, 275, 436, 2278, 278, 3348, 74, 10546, 301, 294, 4019, 11786, 5897, 8950, 17693, 4715, 11333, 9835, 273, 355, 589, 893, 4332, 50276, 84, 1758, 305, 5801, 515, 757, 285, 2663, 869, 30964, 439, 400, 312, 305, 1662, 277, 21733, 267, 1149, 37668, 38622, 3168, 285, 2304, 19243, 3168, 11786, 5897, 8950, 17693, 4715, 342, 3963, 1025, 17660, 5213, 8059, 327, 5145, 4715, 9169, 2832, 1148, 32693, 2061, 5375, 8602, 7174, 883, 50276, 43089, 3168, 285, 2304, 19243, 3168, 15686, 8542, 4872, 11935, 3064, 4715, 5213, 8059, 327, 26279, 6083, 285, 4471, 12788, 2718, 4022, 2832, 1148, 32693, 2061, 5375, 9913, 17391, 40379, 50276, 67, 1064, 274, 362, 256, 50276, 1405, 1362, 256, 268, 5307, 253, 258, 615, 1332, 323, 14940, 273, 19191, 11193, 285, 35221, 4715, 4927, 312, 6698, 327, 1453, 285, 13757, 42671, 577, 30413, 2090, 5987, 3088, 1528, 72, 6903, 15497, 84, 21348, 20, 12522, 28315, 1610, 1036, 1867, 50276, 84, 28738, 391, 256, 278, 3348, 74, 288, 391, 3509, 484, 277, 270, 700, 79, 28923, 256, 9711, 277, 18558, 554, 265, 87, 363, 260, 50276, 88, 827, 1528, 66, 299, 4748, 3809, 11786, 3229, 1154, 3082, 323, 5897, 8950, 17693, 4715, 342, 4872, 1159, 11193, 10061, 273, 253, 3436, 394, 7970, 5213, 8059, 327, 5145, 4715, 50276, 280, 1686, 15630, 1283, 5987, 3088, 1528, 72, 6903, 11838, 15054, 1610, 3566, 15054, 1671, 520, 50276, 965, 86, 270, 35926, 796, 6148, 256, 50276, 965, 86, 480, 4050, 3963, 1025, 745, 22872, 32989, 28269, 16424, 275, 11454, 1491, 5162, 2718, 898, 50275, 6438, 5955, 285, 1407, 953, 50276, 74, 14409, 326, 891, 452, 1239, 253, 643, 10123, 285, 4795, 11985, 285, 891, 452, 1239, 253, 4623, 2544, 275, 253, 16168, 2505, 891, 452, 5439, 619, 4868, 432, 3495, 281, 4887, 326, 2067, 7350, 497, 26353, 4215, 949, 253, 1407, 953, 533, 2067, 747, 7350, 285, 1711, 7350, 3464, 891, 588, 26799, 2708, 50276, 6438, 253, 2488, 1407, 953, 253, 2523, 342, 14940, 285, 14940, 4142, 3176, 281, 452, 644, 11512, 891, 23000, 11435, 253, 1199, 3687, 19843, 275, 253, 16101, 2593, 2299, 891, 1335, 1089, 253, 7680, 281, 320, 45210, 387, 1682, 275, 2426, 273, 38135, 273, 2746, 285, 891, 1089, 326, 253, 1941, 273, 30437, 310, 1335, 15455, 14999, 253, 10199, 273, 247, 3963, 6081, 281, 28523, 305, 2851, 3082, 310, 3139, 417, 4460, 253, 830, 273, 253, 4081, 3963, 6081, 310, 4460, 2299, 891, 1089, 697, 830, 281, 320, 25962, 48714, 347, 352, 5419, 6419, 2403, 2544, 281, 253, 13461, 627, 403, 690, 2720, 2987, 326, 41509, 436, 973, 26332, 492, 5367, 285, 643, 4517, 17187, 13757, 5609, 533, 436, 2929, 1057, 417, 4549, 281, 2720, 2987, 281, 41509, 616, 3963, 6081, 3185, 891, 1364, 10725, 327, 253, 
16774, 1263, 534, 1057, 417, 7409, 253, 4715, 3885, 273, 253, 4081, 5933, 2429, 281, 1666, 25379, 275, 1142, 2219, 253, 4081, 5933, 1057, 417, 4518, 562, 32231, 1666, 25379, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 23970, 271, 2969, 533, 7826, 3576, 745, 22872, 32989, 5933, 50275, 1189, 455, 253, 30628, 3543, 253, 789, 369, 18464, 285, 417, 2568, 4704, 323, 9311, 253, 512, 7478, 253, 4477, 1160, 1534, 11269, 281, 253, 2929, 533, 4092, 3374, 3464, 342, 253, 16774, 789, 12392, 253, 3486, 273, 253, 4081, 6880, 327, 643, 11333, 5816, 1666, 25379, 24088, 32989, 3373, 7990, 273, 12620, 3710, 2074, 5931, 3022, 10625, 1534, 3533, 670, 849, 1682, 4764, 7533, 835, 6777, 323, 5301, 3966, 50276, 2520, 310, 4518, 271, 4722, 3884, 604, 253, 4477, 476, 3157, 253, 4679, 285, 1805, 5999, 366, 616, 1332, 604, 253, 6239, 12873, 281, 253, 6195, 275, 745, 22872, 391, 77, 670, 38757, 285, 11138, 745, 22872, 32989, 3082, 436, 588, 2489, 247, 4891, 7680 ]
Below is a review of a research paper submitted to a conference or journal. Please write a summary of the review.

### Review:

This work considers a regularized IRL setup where, instead of the entropy regularization used in maximum-entropy IRL, an arbitrary convex regularizer Omega is used. The work presents a number of theoretical results for this general setting, and it is shown that when Omega is the Tsallis entropy, the composition RL ∘ IRL is equivalent to minimizing a Bregman divergence defined from the Tsallis entropy and the expert state-action distribution. A practical algorithm is presented for IRL with the Tsallis entropy, and a number of experiments are performed to build understanding of the various components.

I hope that the following questions can be resolved during the discussion period, as some things are a bit unclear to me, which is preventing me from providing a more detailed analysis of the work.

Section 3 / Section 3.1: in the paragraph after Equation 6, can you clarify, for example with concrete examples or equations, what you mean by terms like "functional form" and "intractable"?

Section 4: could you clarify your structured discriminator? What exactly are you using for t_{s,a}^{\pi}, and how does this relate to the Tsallis-entropy baseline equations derived earlier? (A sketch of the standard AIRL discriminator follows this review.) Equation 13: should this be an argmax? I think this should instead be just a small gradient update using this objective, since \hat{t}_{s,a}^{\pi_i} is the correct objective only locally around \pi_i.

Section 5: based on the description in H2, I do not understand what the density-based model is; please clarify.

Section 5.1: can you clarify what you mean by "acquires the ground-truth rewards"? Figure 2 shows different reward curves for each method, so how can they all be the ground truth?

Section 5.2: can you explain how you compute the divergences between the joint state-action distributions p(s, a) of the expert and the trained policy? Also, how do you compute these divergences for the MuJoCo experiments in Section 5.3?

Section 5.3: could you elaborate on what you mean by "unfortunately, RAIRL fails to acquire a reward function that effectively minimizes the target divergence in continuous controls"?

You may also want to cite: Ke, Liyiming; Barnes, Matt; Sun, Wen; Lee, Gilwoo; Choudhury, Sanjiban; Srinivasa, Siddhartha. "Imitation Learning as f-Divergence Minimization." arXiv preprint arXiv:1905.12888, 2019.
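For concreteness on the Section 4 question above, here is a minimal sketch of the standard AIRL discriminator of Fu et al., which RAIRL presumably generalizes to other regularizers. The function values f and the probabilities pi(a|s) below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def airl_discriminator(f_values, policy_probs):
    """Standard AIRL discriminator D = exp(f) / (exp(f) + pi(a|s)).

    f_values:     learned potential values f_theta(s, a)
    policy_probs: current policy probabilities pi(a|s)
    """
    ef = np.exp(f_values)
    return ef / (ef + policy_probs)

def airl_reward(f_values, policy_probs):
    """Reward used for the policy update: log D - log(1 - D), which equals f - log pi."""
    d = airl_discriminator(f_values, policy_probs)
    return np.log(d) - np.log(1.0 - d)

# tiny sanity check of the identity log D - log(1 - D) = f - log pi
f = np.array([0.3, -1.2, 2.0])
pi = np.array([0.5, 0.1, 0.25])
assert np.allclose(airl_reward(f, pi), f - np.log(pi))
```

The identity in the sanity check is what makes the recovered reward depend only on f and the current policy under the Shannon-entropy case; the question above is essentially how this structure changes once the Shannon entropy is replaced by the Tsallis entropy.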
---

This paper shows a formulation of regularized Markov decision processes (MDPs) which is slightly different from that of Geist et al. (2019). The authors then propose a novel inverse reinforcement learning method under regularized MDPs. One of the contributions is that the policy regularization considered here is more general than that of Yang et al. (2019). This paper is written very well and is of publishable quality; I think it is sufficiently significant to be accepted. Still, I have the following questions.

1. The proposed method is based on the relationship between imitation learning and statistical divergence minimization. If my understanding is correct, the Bregman divergence plays a role in generalizing generalized adversarial imitation learning (a sketch of the definition follows this review). However, as the authors mention in Section 6, the Bregman divergence family does not include the f-divergences, which are also studied in imitation learning. Would you discuss the connection to the f-divergence formulation in more detail?

2. I am interested in the relationship between the proposed method and Lee et al. (NeurIPS 2018). Is the proposed method nearly the same as Lee et al. (2018) when the Tsallis entropy is selected as the regularization? If not, does the proposed method outperform Lee et al. (2018) on the MuJoCo control tasks?

3. The authors claim in the introduction that the solutions provided by Geist et al. (2019) are intractable. However, it is shown that the reward-baseline term in Corollary 1 is intractable except for some well-studied setups. Does this imply that the proposed method faces the same difficulty when applied with an arbitrary policy regularization?

4. The experimental results shown in Figure 3 are interesting, but I have a few concerns. In some cases the averaged Bregman divergence of RAIRL-NSM (lambda = 1) was larger than that of random; would you show an example of the learned policy to aid the reader's understanding? Besides, is the same policy regularization used in behavior cloning? Finally, are exp, cos, and sin meaningful regularizers?

5. To derive the practical algorithms, the authors consider the same form of policy regularization used by Yang et al. (2019), which is given by lambda E[phi(pi(a|s))]. Is it possible to derive the algorithm when the regularizer is given by a general Omega(pi)?
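On question 1 above, for reference, here is a minimal sketch of the Bregman divergence induced by a strongly convex regularizer Omega; the notation is assumed for illustration and may differ from the paper's.

```latex
\[
D_{\Omega}(\mu \,\|\, \mu_E)
  \;=\; \Omega(\mu) \;-\; \Omega(\mu_E)
        \;-\; \big\langle \nabla\Omega(\mu_E),\, \mu - \mu_E \big\rangle
\]
% With \Omega(\mu) = \sum_x \mu(x)\log\mu(x) (the negative Shannon entropy) and
% normalized \mu, \mu_E, this reduces to \mathrm{KL}(\mu \,\|\, \mu_E).
```

For normalized distributions, the KL divergence is essentially the only point where the Bregman and f-divergence families overlap, which is the distinction the question is getting at.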
---

Pros:
1. This paper studies an interesting problem, regularized inverse reinforcement learning, which is novel to me and taught me something new.
2. The paper first proposes a general solution for regularized IRL; the Tsallis entropy is then proposed for IRL, together with an adversarial-learning-based training strategy. The methodology seems reasonable and the contribution is good.
3. The paper is well organized, and the experiments are convincing, illustrating the performance under different scenarios, including continuous and discrete reinforcement-learning settings.

Cons:
1. In the experiments, much of the comparison is internal comparison among the proposed methods. In the beginning the authors mention the limitations of Shannon-entropy regularizers; the authors should conduct more experiments to support that statement, since only Experiment 3 currently addresses it.
2. I do not see many comparisons with other baselines; adding more baselines would be better.

In summary, this paper studies a novel problem, regularized inverse reinforcement learning, and proposes several techniques to address the regularization from different angles. The experiments are conducted under different settings, but some of the cons above need to be revised.

---

Post-rebuttal comments: I thank the authors for the response and the efforts in the updated draft; I think the paper is stronger and should be accepted.

Summary: this paper examines the problem of regularized IRL. As opposed to standard IRL, which can have degenerate solutions, regularized IRL has a unique optimal solution. The authors examine different forms of regularization and derive an efficient IRL algorithm which generalizes AIRL and is applicable to continuous control tasks.

Reasons for score: the idea of regularized IRL is a nice contribution to the field. It ties in nicely with recent work on regularized RL and overcomes some of the challenges of classical IRL. The mathematical foundation is rigorous and the experimental results are promising.

Pros:
1. A nice, general mathematical framework that generalizes prior work.
2. Tractable algorithms that work in continuous state and action domains.
3. The empirical analysis of the actual divergence is interesting.

Cons:
1. The paper is a bit notation-heavy, which makes it hard to follow.
2. Some notation is undefined, such as \mathcal{D} on page 2.
3. The choice of entropy regularizers lacks justification: why the Tsallis entropy, and why follow Yang (2019) in the choice of regularizer?
4. In general, many of the theoretical results are given without any intuitive explanation of what they mean or how to interpret them.
5. It is unclear how this differs from the theory for GAIL, which is also stated in terms of a general regularizer.

Other comments: the sentences before the discussion mention that RAIRL is good for safe imitation learning; this is unclear, since safety is undefined. Also, in the same sentences the authors mention that RAIRL fails to minimize the divergence, but it seems to work in the figure for some values of q; what do the authors mean by "fail"? Figure 1 seems to transition from high divergence at the top to low divergence for small log sigma; why? Rather than using a single dash for parentheticals, it is usually better to use a LaTeX em-dash (three dashes, with no space before and after).

---

This paper proposes a new method for regularized inverse RL. The paper builds upon work by Geist et al., who studied regularized MDPs with convex policy regularizers; the Shannon entropy is a special case of such a policy regularizer. The paper extends the analysis of Geist et al. to regularized IRL and devises tractable solutions to regularized IRL that depend only on analytic knowledge of the regularizer. The paper further proposes regularized adversarial IRL (RAIRL), an extension of AIRL by Fu et al., as an algorithm for IRL in regularized MDPs. The algorithm is validated on a number of domains.

The paper is generally clearly written. I believe the paper is technically correct, and I appreciated that the derivations are well explained in the appendix. The paper is novel to the best of my knowledge, and I think better forms of regularization could improve the state of the art in inverse RL and will be of interest to the ICLR community.

I think it is cool that this paper allows us to solve inverse RL problems in Tsallis-entropy-regularized MDPs, and it is a great technical contribution. However, what is less clear to me is why we would want to. As it is, I am leaning towards acceptance because I enjoyed the paper and appreciate the technical contribution, but I think the paper would be substantially stronger if it made a better case for why regularizers other than the usual Shannon entropy are relevant for practical IRL problems, perhaps even with an example application demonstrating a clear benefit over AIRL as well as other imitation-learning baselines.

Minor comments / questions / typos:
- Would RAIRL-NSM eventually reach expert performance on the Ant-v2 task?
- I find the results in Figure 4 a little surprising; I would have expected the divergence to be minimal when the two q values match. Could you comment further on this empirical result?
- I am wondering why the exponential, cosine, and sine regularizers were chosen in Experiment 1. Are these just meant to demonstrate the flexibility of the framework, or is there a motivation for using them in practice?
- I appreciate the need to stay within the page limit, but I would encourage the authors to decompress the paper a bit for a potential camera-ready version. I think it would be nice to include an algorithm box as a summary of the proposed algorithm.
- Figures 3 and 4 are a bit small and hard to parse.
- The paper might benefit from some further proofreading for typos and small grammatical errors, such as missing articles.
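Several of the reviews above ask why the Tsallis entropy and what the index q controls. For reference, here is the standard definition for a discrete action distribution; the normalization conventions are assumed and may differ slightly from the paper's.

```latex
\[
S_{q}\big(\pi(\cdot \mid s)\big)
  \;=\; \frac{1}{q-1}\Big(1 - \sum_{a} \pi(a \mid s)^{q}\Big),
  \qquad q > 0,\; q \neq 1,
\]
% and in the limit q -> 1 it recovers the Shannon entropy:
% \lim_{q \to 1} S_{q}(\pi(\cdot\mid s)) = -\sum_{a}\pi(a\mid s)\log\pi(a\mid s).
```

The q = 2 case corresponds to the sparse Tsallis regularization studied by Lee et al., which is presumably why that comparison comes up in the second review.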
### Summary:

This paper studies inverse reinforcement learning through the prism of regularized Markov decision processes, generalizing MaxEnt-IRL from the negative entropy to any strongly convex regularizer (as a side note, strict convexity might be enough for many of the results). The reviewers appreciated the clarity, the mathematical rigor, and the empirical evaluation of the paper. They asked some questions and raised some concerns that were mostly addressed in the rebuttal and the revision provided by the authors. This is a strong paper, for which the AC recommends acceptance.
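For readers unfamiliar with the framing, here is a minimal sketch of the regularized RL objective the summary alludes to, written with per-state regularization; the notation is an assumption for illustration rather than the paper's exact formulation.

```latex
\[
\max_{\pi}\;\; \mathbb{E}_{\pi}\Big[\sum_{t} \gamma^{t}\,
    \big( r(s_t, a_t) \;-\; \Omega\big(\pi(\cdot \mid s_t)\big) \big)\Big]
\]
% Choosing \Omega(\pi(\cdot\mid s)) = \sum_{a}\pi(a\mid s)\log\pi(a\mid s)
% (the negative Shannon entropy) recovers maximum-entropy RL; MaxEnt-IRL is the
% corresponding inverse problem, and other strongly convex \Omega give the
% generalization discussed in the summary.
```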
(input_ids, attention_mask, and labels token-ID arrays for the example above; numeric contents omitted)
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper tries to quantify how dense representations we need for a specific task more specifically how many dimensions are needed from a given representation for a given task to achieve a percentage of the performance of the entire representation the second thing the paper tries to quantify is how well representations learned for one task can be fine tuned for another experiments are conducted with 4 different representation technique on a dozen or so tasks quick summary while i liked aspects of this including the motivation of having a lightweight way of understanding how well representations transfer across tasks overall my concerns surrounding the methodology and some missing analysis leads me to believe this needs more work before it is ready for publication quality below average i believe the proposed techniques have some flaws which hurt the eventual method there are also concerns about the motivations behind parts of the technique clarity fair there were some experimental details that were poorly explained but in general the paper was readable originality fair there were some nice ideas in the work but i remain concerned about aspects of it significance below average my concern is that the flaws in the method do not make it conducive to use as is strengths things i liked i really liked the motivating problem of being able to hopefully cheaply efficiently estimate transfer potential to understand how well representations will perform on a different task multiple representations and tasks experimented with weaknesses things that concerned me in no specific order w1 adversely affected by rotations one of my big concerns with the work is the way the cfs is computed while it seems ok to estimate these different metrics using only linear models my concern with this is that the linear models are only given a subset of the exact dimensions of the original representations this is very much unlike the learning objectives of most of these representation learning methods and hence is highly biased and dependent on the actual methods and the random seeds used and the rotations it performs in many cases the representations are used starting with a fully connected layer bottom layer on top of the representations and hence rotations of the representations do not affect performance lets take an example say there is a single dimension of the representation that is a perfect predictor of a task suppose we rotated these representations now the signal from the original dimension is split across multiple dimensions and hence the cfs may be deceivingly high to me this is a big concern as different runs of the same representation technique can likely have very different cfs scores based on initializations and random seeds w2 related to the last line i did not see any experiments analysis showing how stable these different numbers are across different runs of the representation technique nor did i see any error bars in the experiments this again greatly concerned me as i am not certain how stable these metrics are w3 baselines for transfer learning i felt this was another notable oversight i would have liked to see results for both trivial baselines like random ranking as well as more informed baselines where we can estimate transfer potential using say k representation techniques and then use that to help us understand how well it would do on the other representations this latter baseline is 
a zerocost baseline as it is not even dependent on the method w4 metrics for ranking of transfer dont make sense and some are missing i also dont understand how precision and ndcg are used as metrics based on my understanding the authors rank which itself is questionable the different tasks in order of potential for transfer and then call this the gold set how is precision and ndcg calculated from this more importantly i dont believe looking at rank alone is sufficient since that completely obscures the actual performance numbers obtained via transfer in most cases i would care about how well my model would perform on transfer not just which tasks i should transfer from i would have wanted to understand something like the correlation of these produced scores with the actual ground truth performance numbers w5 multitask learning i did not see any mention or experiments of what can be expected when the representations are themselves trained on multiple tasks this seems like something that could easily be done in the empirical analysis as well and would provide richer empirical signals as well w6 motivation for cfs i still dont fully understand the need to understand the density of the representation especially in the manner proposed in the paper why is this an important problem perhaps expanding on this would be helpful w7 alternatives to cfs computational concerns a big concern i had was the computational expense of the proposed approach unfortunately i did not see any discussion about this in the paper or empirically i find this striking because i can easily come up with cheaper alternatives to get at this density for example using lasso lars like methods you can perhaps figure out a good reduced dimension set more efficiently if i were to go through the computation of then why not just train a smaller version of that representation technique instead and directly see how well it can encode data in k dimensions via that technique for that task alternatively why not try using a factorization technique to reduce the rank and then see how well the method does for different ranks w7b likewise i wonder if we could just measure transfer more directly as well and why we need to go via these cfs sets w8 the proposed clf weight difference method has some concerning aspects as well for example say we had two task with exact opposite labels they would have a very low weight difference score though they are ideal representations for each other likewise looking at a difference of weight vectors seems arbitrary in other ways as well docsepmeasuring density and similarity of task relevant information in neural representations summary this work attempts to define two kinds of metrics metrics for information density and for information similarity for the sake of automatically detecting similarity between tasks so that transfer learning can be done more efficiently the concepts are clearly explained and the metric for information density seems to match up with intuitions coming out of forward selections approaches the metric for information transfer seems to be the commonplace metric that other works default to when they show that pretrained representations are effective on downstream tasks it is not clear that the notion of similarity through classifier weights makes sense but see below for clarification questions the problem addressed automatic similarity scoring of tasks is important for transfer learning and thus the results have potential to be very impactful if they generalize to other kinds of tasks 
as is they seem to apply only to classification tasks but that is a good step pros clearly written experiments on the datasets chosen do seem to suggest that the proposed methods have potential brings in nice intuition from forward feature selection an important problem with potential for high impact cons it is not clear to me that the classifier difference metric is welldefined is there a constraint on the cfs and classifiers that ensure the difference between the weights really captures what is suggested is it not the case that classifier weights could come out quite different despite the tasks being quite similar if the linear classifiers learned to capitalize on dissimilar yet equally fruitful patterns in the input features do you have thoughts on how this could be applied outside the context of sentence representations and further outside the context of classification those seem to be quite limiting features of these methods which is not to say that they are not useful in that realm but only to clarify my understanding of their possible scope of application these classification datasets are often so close that i do wonder whether even simpler methods would work just as well for example clustering on bagsofwords might also show that sst sstfine and imdb are closesimilartransferable the same could be said for sick and snli it would be nice to see a comparison to such baselines in order to get a sense of how the proposed methods give insights that other unsupervised or supervised methods might give just as well otherwise it is hard to tell how significant these correlations are since the end goal is to determine transferability of tasks and not the methods it does seem like there are simpler baselines that you could compare againstdocsepthis paper proposes simple metrics for measuring the information density in learned representations overall this is an interesting direction however there are a few key weaknesses in my view not least that the practical utility of these metrics is not obvious since they require supervision in the target domain and while there is an argument to be made for the inherent interestingness of exploring these questions this angle would be more compelling if multiple encoder architectures were explored and compared the overarching questions that the authors set out to answer how taskspecific information is stored and to what extent this transfers is inherently interesting and important the proposed metrics and simple and intuitive it is interesting that a few units seem to capture most task specific information the envisioned scenario and hence utility of these metrics is a bit unclear to me here as noted by the authors transfer is most attractive in lowsupervision regimes wrt the target task yet the metrics proposed depend on supervision in the target domain if we already have this then as the authors themselves note it is trivial to simply try out different source datasets empirically on a target dev set it is argued that this is an issue because it requires training 2n networks where n is the number of source tasks i am unconvinced that one frequently enough has access to a sufficiently large set of candidate source tasks for this to be a real practical issue the metrics are tightly coupled to the encoder used and no exploration of encoder architectures is performed the lstm architecture used is reasonable but it would be nice to see how much results change if at all with alternative architectures the cfs metric depends on a hyperparameter the retention ratio 
which here is arbitrarily set to 80 without any justification what is the motivation for the restriction to linear models in the referenced probing paper for example mlps were also used to explore whether attributes were coded for nonlinearly ### Summary:
this paper addresses important general questions about how linear classifiers use features and about the transferability of those features across tasks the paper presents a specific new analysis method and demonstrates it on a family of nlp tasks all four reviewers counting the emergency fourth review found the general direction of research to be interesting and worthwhile but all four shared several serious concerns about the impact and soundness of the proposed method the impact concerns mostly dealt with the observation that the method is specific to linear classifiers and that its only applicable to tasks for which a substantial amount of training data is available as the ac im willing to accept that it should still be possible to conduct an informative analysis under these conditions but im more concerned about the soundness issues the reviewers were not convinced that a method based on the counting of specific features was appropriate for the proposed setting due to rotation sensitivity among other issues and did not find that the experiments were sufficiently extensive to overcome these doubts
[ input_ids: token-ID list omitted ]
[ attention_mask: all 1s, omitted ]
[ labels: identical to input_ids, omitted ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: in this paper the authors study dro with data geometry considered in the uncertainty set making use of the socalled geometric wasserstein distance the authors derive an approximate algorithm for the proposed gdro and prove its convergence numerical experiments are performed to demonstrate the proposed gdro framework over erm and other dro frameworks strengths an interesting framework proposed using the geometric wasserstein distance extensive experiments weaknesses the whole paper is quite hard to follow due to the lack of selfcontainednesssome rather nonstandard notions are not welldefined or wellexplained or very hard to understand unsatisfactory english writing wrong grammar choice of words etc inaccuracy in mathematics exposition the lack of mathematical rigor and lots of abuse of notation related works are not discussed in detail eg dro a lot of typos typesetting and formatting issues the settings and descriptions of the experiments are unclear postrebuttal thanks the authors for their effort the revised version has addressed my concerns and i have raised my score although it is allowed in this neurips submission cycle i think the authors should have provided sufficient experimental details in their initial submission a lot of new content have been added in the revised version of the paper including supplementary material say for the experimental details i feel it is quite unfair to assess the merits of the paper according to this updated version comparing to other paper submissions it appears to me that the authors was submitting unfinished work on the submission deadline and take advantage of the rebuttal phase relevant discussion of limitations might appear in the paper but is hard to find docsepthis paper proposed a novel geometric wasserstein dro gdro method by exploiting the discrete geometric wasserstein distance a generically applicable approximate algorithm is derived for model optimization extensive experiments on both simulation and realworld datasets demonstrate its effectiveness pros 1 the proposed method is well motivated and reasonable this paper studied an important problem of dro the uncertainty set is too overflexible such that it may include implausible worstcase distributions to address this issue the authors proposed to use discrete geometric wasserstein distance to construct the uncertainty set in order to constrain the uncertainty set within the data manifold the method is somewhat novel and interesting 2 both convergence rate and the bounded error rate are provided and the superiority of the proposed method is also empirically demonstrated through experiments on both simulation and realworld datasets cons 1 data from unseen distributions may fall out of the manifold constructed by training data in this case simply constraining the uncertainty set may not be helpful for ood generalization 2 training efficiency the authors use a graph to represent the manifold structure it may be problematic for largescale datasets since the graph needs to be estimated at every iteration 3 in the experiments the authors only compare with erm and drobased methods it would be a bonus if some general methods for ood generalization can be included yes docsepthis paper considers the data geometry in the distributionally robust optimization dro problems and proposes a novel framework called geometric wasserstein dro gdro to achieve their goal the authors also provide some 
theoretical analyses such as approximated optimization error and convergence rate to theoretically show the strengths of gdro finally the experimental results show the effectiveness of the proposed approach strengths the motivation and contributions are good this work attacks an overlooked issue in the dro community and proposes a reasonable method to alleviate this this would raise much more research attention in this direction this paper presents some roughly decent theoretical guarantees which support the effectiveness theoretically weaknesses this paper lacks comprehensive discussions about the related work as we all know dro has attracted tremendous research interest in the machine learning community and there are many studies about dro the proposed gdro may has somewhat limitations since it is not easy to apply to the deep neural networks dnns perhaps i am wrong but at least the authors do not mention it as a whole despite some limitations i believe the proposed method is a qualified work and could bring some new insights into the dro community therefore i tend to accept this paper none docsepthis work aims to solve the problem that dro is too pessimistic the uncertainty set is too large and often leads to poor results in practice the motivation is that high dimensional data approximately reside on low dimensional manifolds lines 67 so this work tries to constrain the uncertainty set on this low dimensional manifold to do this this work i uses the nndescent method to estimate the low dimensional manifold ii formulates the gdro objective and optimizes it with alternating optimization and proves that it converges the authors then conduct a series of experiments on synthetic and real datasets and claim that the proposed method is better than existing dro methods i really like the highlevel idea of this paper dro is too pessimistic is a wellknown open problem in this field and this work tries to solve this problem by constraining the uncertainty set on a low dimensional manifold which makes lots of sense i also think that a lot of credit should be given to the authors for the theory part the gdro objective is nicely formulated and optimizing this objective with alternating optimization makes sense the convergence result is also very nice the major weaknesses of this work however come from the experiment section recall that the core motivation of this work is that high dimensional data approximately reside on low dimensional manifolds lines 67 whereas there is a huge gap between the motivation and the experiments which makes the experiments very confusing and unconvincing take the first experiment in section 41 as an example first of all the experimental setting is very confusing i suppose that the task is to infer y from s v i also couldnt find the definition of alphav so i suppose that it is used to define v thus in this task the input data is 2dimensional and s v seems to also reside on a 2dimensional manifold which is the union of a number of 1dimensional curves if alphav is in a b for some a and b i dont think this could be called high dimensional data approximately residing on low dimensional manifolds then the authors claim that gdro is much better than other dro on this task i dont know how the graph g0 is estimated i suppose that the authors just simply provide g0 to the algorithm because nndescent is only used for largescaled datasets lines 109110 so it seems to me that the reason why gdro is so good is that g0 leaks some additional information about the target distribution to 
it but not to other methods not because it leverages the fact that the data resides on a low dimensional manifold of course it is nice that gdro could utilize this additional leaked information from g0 the question is how to get this g0 in practice the authors propose to estimate g0 with nndescent but they dont demonstrate how well nndescent can estimate g0 on realistic tasks if g0 is not well estimated and the target distribution is outside the estimated manifold then i imagine that gdro could completely fail moreover most of the tasks in the experiments are not really highdimensional 50 dimensions and all tasks seem to follow some simple unrealistic structures which make it easier for gdro to achieve high performances it is questionable whether these good performances are transferable to realworld applications with realistic distribution shifts a valid experimental setting i would suggest the authors try is the following the input data comes from a low dimensional manifold in a high dimensional space at least 200 dimensions but the structure of this manifold is unknown for instance introduce randomness into the manifold structure so gdro must first estimate g0 by itself this setting is closer to the authors motivation that high dimensional data approximately reside on low dimensional manifolds otherwise it is always questionable whether the performance gain of gdro comes from the information leakage from the provided g0 rather than its ability to estimate and utilize the lowdimensional manifold in summary i really like the highlevel idea and the theory part of this paper but the experiment section does require a lot of improvement currently there is a huge gap between the authors motivation and the experiments making the main conclusion of this paper highly debatable for this reason i recommend rejecting this paper for this time and hope that the authors could resubmit after rewriting the experiment section post rebuttal the authors have revised the paper as suggested so i would like to raise my rating to accept the limitations are not sufficiently addressed ### Summary:
the paper proposes a novel distributionally robust optimization formulation leveraging data geometry to construct the uncertainty set after a lengthy discussion and revision process the reviewers have reached a consensus acceptance recommendation which i support currently the reproducibility checklist part 3a states that the authors submitted code to reproduce their results along with the paper but i do not see it as part of the supplementary material or as a link please provide code with the camera ready submission or correct the checklist
[ input_ids: token-ID list omitted ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 249, 436, 2929, 253, 4477, 1263, 3926, 342, 941, 12087, 2783, 275, 253, 11649, 873, 2403, 897, 273, 253, 9267, 18859, 17856, 369, 2152, 6339, 4181, 253, 4477, 15313, 271, 16851, 5933, 323, 253, 4081, 305, 3002, 285, 5276, 697, 14940, 10704, 4679, 403, 2684, 281, 7568, 253, 4081, 305, 3002, 7792, 689, 209, 693, 285, 643, 3926, 31225, 50276, 296, 3755, 20556, 50275, 266, 4722, 7792, 4081, 970, 253, 17856, 369, 2152, 6339, 4181, 50276, 2068, 3134, 4679, 50276, 20881, 1255, 265, 50275, 783, 2644, 2929, 310, 3240, 1892, 281, 956, 1955, 281, 50271, 783, 3480, 273, 1881, 41010, 1255, 8826, 2581, 1327, 15291, 27367, 403, 417, 6210, 392, 37224, 390, 6210, 1591, 446, 1243, 390, 1077, 1892, 281, 2096, 50272, 4539, 16412, 6697, 48087, 4028, 3430, 28146, 4327, 273, 3000, 3966, 50272, 249, 18921, 1974, 275, 23065, 47284, 253, 3480, 273, 15965, 8132, 263, 285, 8783, 273, 7242, 273, 14951, 50276, 4919, 2987, 403, 417, 5469, 275, 2508, 24088, 3926, 50276, 66, 2257, 273, 963, 993, 3510, 33513, 285, 33907, 3374, 50275, 783, 7533, 285, 20121, 273, 253, 4679, 403, 12744, 50274, 5996, 250, 2858, 22559, 6701, 253, 4477, 323, 616, 3434, 253, 17265, 2715, 556, 9713, 619, 7350, 285, 891, 452, 5439, 619, 4868, 3738, 352, 310, 4136, 275, 436, 5723, 2824, 19529, 5880, 891, 1158, 253, 4477, 943, 452, 2530, 4209, 5661, 4278, 275, 616, 3302, 19529, 247, 2257, 273, 747, 2600, 452, 644, 2879, 275, 253, 17265, 2715, 273, 253, 2929, 1690, 24864, 2144, 1333, 323, 253, 5661, 4278, 891, 1928, 352, 310, 3240, 16593, 281, 2939, 253, 16108, 273, 253, 2929, 2556, 281, 436, 9300, 2715, 10941, 281, 643, 2929, 35103, 352, 4620, 281, 479, 326, 253, 4477, 369, 29315, 46809, 789, 327, 253, 19529, 20639, 285, 1379, 5750, 273, 253, 30080, 22559, 3408, 50276, 15477, 5955, 273, 7364, 1537, 3176, 275, 253, 2929, 533, 310, 1892, 281, 1089, 50275, 7152, 33032, 2520, 2929, 4081, 247, 4460, 17856, 369, 2152, 6339, 3926, 305, 3002, 1332, 407, 38883, 253, 13358, 17856, 369, 2152, 6339, 4181, 247, 1006, 1037, 7763, 16851, 5933, 310, 6012, 323, 1566, 13757, 9470, 4679, 327, 1097, 9864, 285, 1524, 10186, 15302, 7568, 697, 12510, 50276, 856, 84, 337, 253, 4081, 1332, 310, 973, 17194, 285, 5272, 436, 2929, 5421, 271, 1774, 1895, 273, 3926, 253, 11649, 873, 310, 1512, 689, 23210, 917, 824, 326, 352, 778, 2486, 3898, 666, 917, 9065, 5045, 10670, 281, 2953, 436, 2523, 253, 4477, 4081, 281, 897, 13358, 17856, 369, 2152, 6339, 4181, 281, 3989, 253, 11649, 873, 275, 1340, 281, 37709, 253, 11649, 873, 1561, 253, 941, 16751, 253, 1332, 310, 8489, 4460, 285, 4722, 50275, 19, 1097, 14940, 2281, 285, 253, 11542, 2228, 2281, 403, 2530, 285, 253, 34385, 273, 253, 4081, 1332, 310, 671, 45190, 5183, 949, 4679, 327, 1097, 9864, 285, 1524, 10186, 15302, 50276, 5040, 337, 941, 432, 39709, 10670, 778, 2965, 562, 273, 253, 16751, 8818, 407, 3733, 941, 275, 436, 1083, 3365, 1030, 26208, 253, 11649, 873, 778, 417, 320, 9371, 323, 258, 351, 26647, 50276, 19, 3733, 6733, 253, 4477, 897, 247, 4216, 281, 1957, 253, 16751, 2605, 352, 778, 320, 20276, 323, 1236, 2510, 25912, 15302, 1580, 253, 4216, 3198, 281, 320, 5998, 387, 1046, 19502, 50276, 20, 275, 253, 4679, 253, 4477, 760, 7277, 342, 209, 693, 285, 3926, 3169, 3082, 352, 651, 320, 247, 17301, 604, 690, 2087, 3082, 323, 50276, 836, 26647, 476, 320, 2908, 50273, 9820, 5474, 33032, 2520, 2929, 19401, 253, 941, 12087, 275, 253, 3268, 595, 10237, 13757, 3926, 3237, 285, 29328, 247, 4460, 
7792, 1925, 17856, 369, 2152, 6339, 3926, 305, 3002, 281, 5115, 616, 4736, 253, 4477, 671, 2085, 690, 10527, 6260, 824, 347, 34930, 13757, 2228, 285, 14940, 2281, 281, 28055, 921, 253, 20544, 273, 305, 3002, 4720, 253, 5661, 1543, 921, 253, 12510, 273, 253, 4081, 2746, 20544, 50276, 783, 16038, 285, 9021, 403, 1175, 436, 789, 8104, 271, 28849, 2523, 275, 253, 3926, 3114, 285, 29328, 247, 5272, 1332, 281, 33623, 436, 436, 651, 7164, 1199, 625, 2561, 4116, 275, 436, 3884, 50276, 2520, 2929, 10262, 690, 11467, 12524, 10527, 23632, 534, 1329, 253, 12510, 28055, 50276, 20881, 1255, 265, 50276, 2520, 2929, 19756, 11088, 11985, 670, 253, 2905, 789, 347, 359, 512, 871, 3926, 556, 17755, 19999, 2561, 1600, 275, 253, 5145, 4715, 3114, 285, 627, 403, 1142, 2175, 670, 3926, 50276, 783, 4081, 305, 3002, 778, 556, 8489, 7364, 1580, 352, 310, 417, 3477, 281, 4647, 281, 253, 3676, 11454, 6928, 277, 79, 2224, 4931, 891, 717, 3430, 533, 387, 1878, 253, 4477, 513, 417, 3748, 352, 50276, 284, 247, 2644, 5747, 690, 7364, 891, 2868, 253, 4081, 1332, 310, 247, 12165, 789, 285, 812, 3324, 690, 747, 16039, 715, 253, 3926, 3114, 3103, 891, 5257, 281, 2997, 436, 2929, 5293, 50276, 7152, 33032, 2520, 789, 13698, 281, 8415, 253, 1895, 326, 3926, 310, 1512, 45234, 2531, 253, 11649, 873, 310, 1512, 1781, 285, 2223, 5644, 281, 4105, 1543, 275, 3946, 253, 16038, 310, 326, 1029, 15759, 941, 5512, 28932, 327, 1698, 15759, 28236, 3104, 9963, 594, 436, 789, 14177, 281, 37709, 253, 11649, 873, 327, 436, 1698, 15759, 16751, 281, 513, 436, 436, 789, 891, 4648, 253, 295, 2109, 40513, 1332, 281, 6642, 253, 1698, 15759, 16751, 21255, 17075, 684, 253, 305, 3002, 8103, 285, 5556, 4219, 352, 342, 28035, 13757, 285, 19539, 326, 352, 26414, 253, 4477, 840, 2589, 247, 2962, 273, 4679, 327, 13506, 285, 1524, 15302, 285, 1750, 326, 253, 4081, 1332, 310, 1805, 685, 5368, 3926, 3082, 891, 1663, 751, 253, 1029, 5251, 2934, 273, 436, 2929, 3926, 310, 1512, 45234, 2531, 310, 247, 973, 4304, 1527, 1895, 275, 436, 1673, 285, 436, 789, 14177, 281, 8415, 436, 1895, 407, 1030, 26208, 253, 11649, 873, 327, 247, 1698, 15759, 16751, 534, 2789, 8783, 273, 3282, 891, 671, 1158, 326, 247, 2257, 273, 6152, 943, 320, 1677, 281, 253, 4477, 323, 253, 3762, 629, 253, 305, 3002, 8103, 310, 23395, 26115, 285, 39793, 436, 8103, 342, 28035, 13757, 2789, 3282, 253, 14940, 906, 310, 671, 1077, 5322, 50276, 783, 2201, 32213, 273, 436, 789, 2299, 1705, 432, 253, 3368, 2593, 6983, 326, 253, 5161, 16038, 273, 436, 789, 310, 326, 1029, 15759, 941, 5512, 28932, 327, 1698, 15759, 28236, 3104, 9963, 5727, 627, 310, 247, 5699, 8037, 875, 253, 16038, 285, 253, 4679, 534, 2789, 253, 4679, 1077, 21643, 285, 10915, 87, 19163, 50276, 21528, 253, 806, 3368, 275, 2593, 7609, 347, 271, 1650, 806, 273, 512, 253, 5661, 4758, 310, 1077, 21643, 891, 9428, 326, 253, 4836, 310, 281, 9441, 340, 432, 256, 362, 891, 671, 812, 2649, 1089, 253, 5426, 273, 355, 545, 580, 594, 891, 9428, 326, 352, 310, 908, 281, 4853, 362, 3021, 275, 436, 4836, 253, 3280, 941, 310, 374, 6967, 285, 256, 362, 3133, 281, 671, 28932, 327, 247, 374, 6967, 16751, 534, 310, 253, 8083, 273, 247, 1180, 273, 337, 6967, 9191, 604, 355, 545, 580, 310, 275, 247, 270, 323, 690, 247, 285, 270, 891, 13414, 1158, 436, 812, 320, 1925, 1029, 15759, 941, 5512, 33978, 327, 1698, 15759, 28236, 50275, 7461, 253, 4477, 1750, 326, 305, 3002, 310, 1199, 1805, 685, 643, 3926, 327, 436, 4836, 891, 13414, 871, 849, 253, 4216, 305, 17, 310, 5998, 891, 9428, 326, 253, 4477, 816, 3365, 2085, 305, 17, 281, 253, 5933, 984, 295, 2109, 40513, 310, 
760, 908, 323, 1236, 2510, 1179, 264, 15302, 3104, 884, 4739, 740, 594, 352, 3133, 281, 479, 326, 253, 1921, 2139, 305, 3002, 310, 594, 1175, 310, 326, 305, 17, 31693, 690, 3081, 1491, 670, 253, 2303, 3268, 281, 352, 533, 417, 281, 643, 3082, 417, 984, 352, 19732, 1131, 253, 958, 326, 253, 941, 31951, 327, 247, 1698, 15759, 16751, 50276, 1171, 2282, 352, 310, 5322, 326, 305, 3002, 812, 16584, 436, 3081, 31347, 1491, 432, 305, 17, 253, 1953, 310, 849, 281, 755, 436, 305, 17, 275, 3946, 253, 4477, 12661, 281, 6642, 305, 17, 342, 295, 2109, 40513, 533, 597, 13414, 7568, 849, 973, 295, 2109, 40513, 476, 6642, 305, 17, 327, 15958, 8892, 604, 305, 17, 310, 417, 973, 5998, 285, 253, 2303, 3268, 310, 3345, 253, 5998, 16751, 840, 891, 8564, 326, 305, 3002, 812, 4336, 1891, 50276, 3062, 1189, 954, 273, 253, 8892, 275, 253, 4679, 403, 417, 1663, 1029, 6967, 2456, 10103, 285, 512, 8892, 1646, 281, 956, 690, 2969, 46521, 5289, 534, 1056, 352, 6927, 323, 305, 3002, 281, 5115, 1029, 16226, 352, 310, 30455, 1880, 841, 1175, 16226, 403, 3700, 494, 281, 1524, 10186, 4893, 342, 15958, 3268, 15036, 50276, 66, 3588, 5661, 4758, 891, 651, 1804, 253, 4477, 1611, 310, 253, 1563, 253, 3280, 941, 3249, 432, 247, 1698, 15759, 16751, 275, 247, 1029, 15759, 2317, 387, 1878, 1052, 10103, 533, 253, 2605, 273, 436, 16751, 310, 7202, 323, 4227, 9569, 3632, 1255, 715, 253, 16751, 2605, 594, 305, 3002, 1364, 806, 6642, 305, 17, 407, 3139, 436, 4758, 310, 8003, 281, 253, 4477, 16038, 326, 1029, 15759, 941, 5512, 28932, 327, 1698, 15759, 28236, 5010, 352, 310, 1900, 30455, 1880, 253, 3045, 6351, 273, 305, 3002, 3249, 432, 253, 1491, 23753, 432, 253, 2530, 305, 17, 2581, 685, 697, 3745, 281, 6642, 285, 16584, 253, 1698, 6967, 16751, 50276, 249, 6010, 891, 1663, 751, 253, 1029, 5251, 2934, 285, 253, 3762, 629, 273, 436, 2929, 533, 253, 3368, 2593, 1057, 2430, 247, 2257, 273, 7756, 4390, 627, 310, 247, 5699, 8037, 875, 253, 4477, 16038, 285, 253, 4679, 2403, 253, 2022, 6452, 273, 436, 2929, 4122, 4274, 17980, 323, 436, 1921, 891, 5583, 33944, 436, 2929, 323, 436, 673, 285, 3524, 326, 253, 4477, 812, 501, 538, 2225, 846, 294, 17695, 253, 3368, 2593, 50275, 5996, 30080, 22559, 50276, 783, 4477, 452, 17265, 253, 2929, 347, 5125, 594, 891, 651, 751, 281, 7164, 619, 13716, 281, 2997, 50276, 783, 7364, 403, 417, 10481, 9713, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 247, 4460, 3268, 595, 10237, 13757, 15895, 19732, 2977, 941, 12087, 281, 3989, 253, 11649, 873, 846, 247, 24585, 5955, 285, 18520, 1232, 253, 30628, 452, 4925, 247, 13969, 14924, 17401, 534, 891, 1329, 50276, 47590, 253, 38041, 44282, 629, 495, 66, 3054, 326, 253, 4477, 9262, 2127, 281, 18302, 616, 1543, 2112, 342, 253, 2929, 533, 891, 513, 417, 923, 352, 347, 629, 273, 253, 24864, 2144, 390, 347, 247, 3048, 4496, 2085, 2127, 342, 253, 6568, 4704, 19529, 390, 3451, 253, 44282, 209 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

Summary: this paper introduces Shoot Tree Search (STS), a planning algorithm that performs a multistep expansion in Monte Carlo Tree Search. Standard MCTS algorithms expand the search tree by adding one node to the tree for each simulation; in contrast, the proposed STS adds multiple nodes to the search tree at each simulation, where each node corresponds to a state and action encountered during the rollout. Through multistep expansion, the evaluation of the trajectory is less biased, which can be seen as analogous to n-step TD. In the experiments on the Sokoban and Google Research Football domains, STS outperforms baselines that include random shooting, bandit shooting, and MCTS.

Overall, my main concerns are technical novelty and presentation quality. The most common MCTS methods assume that the leaf node is expanded one at a time in each simulation and that its evaluation is performed either by a rollout policy or by a function approximator, but this common practice does not necessarily mean that MCTS should always do that. The main reason for only expanding one node per simulation in standard MCTS is memory efficiency: if we fully expand the rollout trajectory and retain its information in the search tree, we may get slightly more accurate value estimates; however, the nodes located deep in the tree will not be visited more than once in most cases, so the effect is usually not significant, leading to the common practice of one-step expansion. More importantly, multistep expansion has already been used in existing works; e.g., in [1] the tree is expanded by adding the whole rollout trajectory. Thus I am not convinced that this work introduces a technical novelty.

It seems that the relative benefit of STS over MCTS observed in the experiments comes from the bias of the value function approximator. However, to show the effectiveness of multistep expansion compared to single-step expansion, I think more thorough ablation experiments should have been conducted. For example, we can consider the setting where both STS and MCTS perform leaf-node evaluation (i.e., update in Algorithm 5) by executing a rollout policy rather than by using the value function approximator. By doing so, we can focus only on the benefit of STS retaining information from the full rollout trajectory (i.e., multistep expansion) compared to MCTS retaining one-step information (i.e., single-step expansion), while eliminating the effect of biased value function estimation. To relieve too much bias in the current MCTS leaf node evaluation, mixing the MC return of the rollout policy and the output of the value network could also have been considered, as in AlphaGo (Silver et al., 2016). It would be great to see whether STS still has advantages over MCTS in various leaf node evaluation situations.

Also, more writing effort may be required, and the current version of the manuscript seems premature to be published. There are some unclear or questionable parts. Algorithm 3 and Algorithm 4 are not contributions of this work, so they can be removed or moved to the appendix; instead, more discussion of the proposed method should have been placed in the main text. In Algorithm 2, the definition of calculatetarget is missing. In Algorithm 5, in select, the tree policy is defined by chooseaction, which selects a purely greedy action; if this describes the MCTS used in the experiments, I would say this is wrong: to make MCTS work properly, an in-tree policy that balances exploration vs. exploitation is required, and a classical choice is the UCB rule.

In Algorithm 6, in update, N(s,a) and quality are increased by c times more, which means that the longer the rollout length, the more weight is given. What is the reason for assigning more weight to a trajectory that has a longer rollout length? If the entire planning horizon is limited to a finite length, this means that early simulations (short path length, long rollout length) have more weight than later simulations (long path length, short rollout length), but I do not think this is desirable. Is my understanding correct?

For the Sokoban experiments, the pretrained value function would significantly affect the performance of MCTS and STS, but I could not find how the value function was pretrained. In Appendix A2, the hyperparameters for shooting and STS are very different; why did you set the shooting hyperparameters differently from STS (e.g., value-function zero-initialization, action sampling temperature, etc.)? It seems that the choice of zero-initialization of the value network is rather arbitrary, and I am not convinced that this would always work better; in some situations, optimistic initialization of the value network may be helpful to encourage exploration of uncertain state regions. In Table 2, why does random shooting + PPO underperform PPO? Since random shooting + PPO puts additional search effort on top of PPO, I expected that it must outperform PPO. Table 5 could have been moved to the main text, replacing Table 2.

[1] Soemers et al., Enhancements for real-time Monte-Carlo tree search in general video game playing, 2016 IEEE Conference on Computational Intelligence and Games (CIG), 2016.
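For reference, the two standard ingredients this reviewer points to can be written as follows. These formulas are textbook material and are not taken from the submission; the notation (Q, N, c, lambda, z, v_theta) is generic rather than the paper's own:

```latex
% UCB1 / UCT in-tree selection, balancing exploitation and exploration
a^{*} = \arg\max_{a}\left[ Q(s,a) + c \sqrt{\frac{\ln N(s)}{N(s,a)}} \right]

% AlphaGo-style mixed leaf evaluation (Silver et al., 2016): blend the value
% network v_{\theta} with the Monte Carlo return z of the rollout policy
V(s_L) = (1-\lambda)\, v_{\theta}(s_L) + \lambda\, z
```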
docsep

Summary: this paper proposes a new algorithm named Shoot Tree Search (STS) to perform planning in large state spaces. The authors construct STS by redesigning the expansion phase of MCTS using multistep expansion. The authors provide pseudocode for STS and compare the performance of STS and MCTS empirically in various domains such as Sokoban and Google Research Football (GRF).

Comments: firstly, there is no intuitive explanation of why, what, and how. Even after reading the paper, I do not agree that STS is good, because there is no intuition as to why it is better than naive MCTS. In more detail, I have a question. The main difference between STS and MCTS seems to be using multistep expansion versus 1-step expansion. Multistep expansion will gather more information about (s,a) pairs with high Q(s,a) value, because the actions are chosen by argmax Q and STS expands such trajectories. But in sparse-reward problems, STS and MCTS will work similarly; moreover, before getting a positive reward, STS may be worse than MCTS, because it requires more samples to explore, since STS spends more samples on (s,a) pairs with high Q-values that are not meaningful yet. So I think this paper needs at least a discussion, on an intuitive level, of the advantage of STS. In addition, the empirical details in the appendix (Figures 7 and 8 on pages 18 and 19, respectively) look weird: each algorithm seems to have stopped randomly or incompletely. Also, the authors seem to need to make an effort to make the paper more self-contained.

Minor comments: some abbreviations are used without their full word or phrase; for example, MCTS is used on page 1 but the full phrase only appears on page 3, and likewise RL. There are no references for random shooting and bandit shooting; the authors should provide more explanation of them, with references.
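To make the 1-step vs. multistep expansion distinction that the reviews above keep returning to concrete, here is a minimal, self-contained sketch. It is purely illustrative: the Node class, the toy dynamics, and all function names are invented here and are not the authors' implementation.

```python
import random


class Node:
    """Minimal search-tree node; the 'state' is just an integer in this toy example."""
    def __init__(self, state, parent=None):
        self.state = state
        self.parent = parent
        self.children = []
        self.visits = 0
        self.value_sum = 0.0


def toy_step(state, action):
    """Toy deterministic dynamics: action 1 increments the state, action 0 decrements it."""
    return state + (1 if action == 1 else -1)


def expand_single_step(leaf):
    """Standard MCTS-style expansion: attach exactly one new node below the leaf."""
    child = Node(toy_step(leaf.state, random.choice([0, 1])), parent=leaf)
    leaf.children.append(child)
    return child


def expand_multi_step(leaf, horizon):
    """STS-style expansion (as described in the reviews): attach the whole simulated
    path of `horizon` nodes below the leaf, one node per visited state."""
    node = leaf
    for _ in range(horizon):
        child = Node(toy_step(node.state, random.choice([0, 1])), parent=node)
        node.children.append(child)
        node = child
    return node  # the deepest newly added node


def backup(node, value):
    """Propagate a value estimate to the root, updating every node kept in the tree."""
    while node is not None:
        node.visits += 1
        node.value_sum += value
        node = node.parent


if __name__ == "__main__":
    root = Node(state=0)
    expand_single_step(root)              # the tree grows by exactly one node
    deepest = expand_multi_step(root, 5)  # the tree grows by five nodes along one path
    backup(deepest, value=1.0)            # all five path nodes receive this estimate
    print(len(root.children), root.visits)  # 2 children of the root; root.visits == 1
```

The only point of the toy example is that the multistep variant retains every state visited along the simulated path, so a later backup updates all of them, whereas the standard expansion keeps a single new node per simulation.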
docsep

Summary: this paper presents a new planning algorithm called Shoot Tree Search to control the tradeoff between depth and breadth of the search. STS modifies the expansion phase of tree search by choosing multiple actions (i.e., more than one) instead of a one-level expansion. The presented idea is simple and straightforward, and seems to provide an improvement over existing tree-based planning algorithms. The detailed ablation studies provide insights about the choices made in the paper.

Reasons for score: overall, I liked the paper and the simplicity of the idea. However, my major concern is the comparison with MCTS. I am not convinced that STS would outperform vanilla MCTS when the number of simulations is on the order of thousands (e.g., the number of simulations in the AlphaGo paper is around 1600).

Strengths: the idea is simple and seems to outperform the vanilla MCTS implementation in environments with a large action space.

Weaknesses: the comparison with related work is not thorough, which makes it hard to come to a decisive conclusion about the performance of the proposed method. There is some missing related work, e.g., using a policy network for multiple rounds of simulations.

Questions: what would the benefits be if we had a policy network to perform the rollouts (e.g., a method similar to [1])? In general, the benefit of an MCTS algorithm like AlphaGo, which performs around 1600 simulations, presents itself when the number of simulations is large; can you compare running MCTS with a larger number of simulations (e.g., large c) against STS? Can you please provide some insight into why STS underperforms random shooting in Corner?

[1] httpscsbrownedupeoplegdkpubsanalysismctspdf

docsep

The authors present a method that combines Monte Carlo Tree Search (MCTS) and random rollouts. The authors relate this to the bias-variance tradeoff observed in n-step temporal difference methods. The authors evaluate their method on Sokoban and the Google Football league environment. The results show that the authors' method leads to marginal improvements on these domains.

I do not think what the authors are doing is very novel, as MCTS combined with rollouts was already used in AlphaGo. Furthermore, I believe the small difference in results can be made up by using only MCTS with a different exploration parameter, i.e., like the one that was used in the AlphaGo paper. I would like to know what benefits this method brings that cannot be obtained from combining MCTS with rollouts as in AlphaGo, or from a hyperparameter search with MCTS. Is there an analysis of the bias-variance tradeoff of this method?
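For context on the bias-variance analogy invoked above (standard textbook material, not taken from the submission; the reward, discount, and value symbols are generic notation), an n-step return bootstraps from the value estimate only after n sampled steps:

```latex
G_t^{(n)} = r_{t+1} + \gamma\, r_{t+2} + \dots + \gamma^{\,n-1} r_{t+n} + \gamma^{\,n}\, V(s_{t+n})
```

Larger n, i.e. a deeper expansion before bootstrapping, relies less on a possibly biased value approximator V but accumulates more sampling variance, which is exactly the tradeoff the reviewers ask the authors to analyze.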
docsep

Summary: the paper presents Shoot Tree Search, an approach that can basically be summarised as a variant of MCTS that expands (adds to the search tree) a longer sequence of up to h nodes per iteration, as opposed to the standard approach of expanding a single node per iteration. The experiments demonstrate improved performance in comparison to a standard MCTS and a variety of simpler rollout-based planning approaches in challenging planning domains such as Sokoban and Google Research Football.

Strong points:
1. Well written, mostly easy to read and understand.
2. Simple but interesting idea.
3. Thorough empirical evaluation, interesting results.

Weak points:
1. The paper describes the modification of MCTS into STS, which consists of making it expand a longer sequence of up to h nodes within a single iteration, as an entirely novel way to extend MCTS, but I am not sure that that is entirely the case. For instance, Coulom's 2006/2007 paper "Efficient selectivity and backup operators in Monte-Carlo tree search" already states that "in practice, not all the nodes are stored; storing the whole tree would waste too much time and memory; only nodes close to the root are memorized", which suggests that something like this may have already been considered, but in that case was not found to be worthwhile. The 2016 paper "Enhancements for real-time Monte-Carlo tree search in general video game playing" describes that "in this paper, the tree is simply expanded by adding the whole playout to the tree", which seems similar. I do still like that the paper performs a thorough evaluation of this idea, which I am not aware of appearing in previous literature, and the setting with DNNs for value/policy function approximations is also different from the aforementioned papers, which may lead to different tradeoffs. The use of DNNs for the value function probably changes the story quite a bit here, because the longer horizon h also changes the point at which the value function is computed, as opposed to those older papers with values estimated by random rollouts, which remain the same regardless of the horizon h. So I am not saying the idea is not novel enough, just that some discussion of past work seems to be missing.
2. In my experience, the primary reasons historically for the typical strategy of expanding just one node per iteration in standard MCTS (without DNNs) are (1) to reduce memory usage, especially when copies of game states are stored inside nodes, because then every node can be quite big, and (2) efficiency, because if you store copies of game states in nodes and create more nodes, you also need to copy more game states, whereas a random playout without node and state storing can just roll out at once without making intermediate copies of states. I am somewhat missing a discussion of these kinds of considerations.
3. I am not sure that I can fully understand the experiment setup. In particular, looking at Table 1, c is a hyperparameter denoting the number of planning passes, and np is described as the average number of passes until the solution is found. How can np ever exceed c? Shouldn't it be upper bounded by c? I guess c might be the number of planning passes per time step and np the total over the entire episode, or something like that, but this is not really clear to me. If the algorithms are really restricted to just c iterations of MCTS, I guess it is fair to always keep c*h constant, and then my points above about memory usage and efficiency are not a big deal, since they would still be equal across all scenarios, but I am a bit confused here due to np exceeding c.

Overall recommendation: right now I have too many little points of confusion and missing discussion, as pointed out under the weak points above, to recommend acceptance. That said, there is also enough to like about the paper, and I can easily envision that most of the points of confusion could be relatively straightforward to clear up in a revision.

Questions for authors: could you please clarify the points raised under the weak points above?

Minor comments:
- On the first page, the comma in "Google Research Football, is an advanced" seems unnecessary and confusing.
- On page 6, the wording "shooting methods perform poorly for Sokoban" could be confusing, because the newly proposed Shoot Tree Search method can very easily be interpreted as also being a shooting method, due to its name.
- In Lemma A61, the assumption that STS and MCTS build the same tree T seems to me like a very strong assumption: for this to be true, the MCTS has to make very specific choices with very frequent overlap, and making identical choices across different iterations is inherently somewhat unlikely due to the visit count terms in PUCT and other selection strategies.

After discussion: I increased my review from marginally below to marginally above the acceptance threshold.
Most of the remarks I had were at least partially addressed. If the paper gets accepted, I would still recommend looking at some of them again and clarifying more. A simple explicit remark somewhere around Table 1, explaining that np can indeed exceed c due to relevant parts of the search tree being preserved across time steps, would help a lot. Some more explicit discussion of why the difference between using trained value functions vs. heuristics or terminal results matters so much that it makes this substantially different from prior work would also help. I understand that it is because in prior work the only advantage of storing all those extra nodes was really just that it could retain slightly more information from backpropagations into those nodes, whereas in your case it changes which state is the state that gets evaluated by a trained value function, but this should be more explicit in the paper.
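To spell out the np-versus-c point with a small, purely illustrative calculation (the numbers and the symbol T are invented here, not taken from the paper): if the subtree below the executed action is kept across environment time steps, then per-step budgets accumulate along the executed path,

```latex
% with a per-step budget of c planning passes and tree reuse over T time steps,
% counts accumulated along the executed path can reach
n_p \le c \cdot T, \qquad \text{e.g. } c = 10,\; T = 4 \;\Rightarrow\; n_p \le 40 > c .
```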
### Summary:

This paper proposes a modification to MCTS in which a sequence of nodes, obtained by following the policy prior, is added to the search tree per simulation, rather than just a single node. This encourages deeper searches than what is typically attained by vanilla MCTS. STS results in slightly improved performance in Sokoban and much larger improvements in Google Research Football.

R4 and R1 both liked the simplicity of the idea, with R1 also praising the paper for the thoroughness of its evaluation. I agree that the idea is interesting and worth exploring, and am impressed by the scope of the experiments in the paper as well as the additional ones linked to in the rebuttal. However, R1 and R5 explicitly noted they had many points of confusion, and across the reviews there seemed to be many questions regarding the difference between STS and other variants of MCTS. I also needed to read parts of the paper multiple times to fully understand the approach. If this many experts on planning and MCTS are confused, then I think readers who are less familiar with the area will definitely struggle to understand the main takeaways. While I do think the clarifications and new experiments provided in the rebuttal help, my overall sense is that the paper at this stage is not written clearly enough to be ready for publication at ICLR. I would encourage the authors to try to synthesize their results and organize them more succinctly in future versions of the paper.

One comment about a point of confusion that I had: I noticed the PUCT exploration parameter was set to zero for Sokoban and one for GRF, with the explanation that many values were tried, though these values are unspecified. As the exploration parameter is normally considered to be the thing that controls whether MCTS acts more like BFS (c -> infinity) or DFS (c -> 0), I would encourage the authors to more explicitly report which values they tried and to be clearer about the advantage of STS's multistep expansions over low values of the exploration parameter.
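For readers unfamiliar with the BFS/DFS remark above, the standard PUCT selection rule (generic notation, not the submission's) makes the role of the constant explicit:

```latex
a^{*} = \arg\max_{a}\Big[ Q(s,a) + c_{\mathrm{puct}}\, \pi(a \mid s)\, \frac{\sqrt{N(s)}}{1 + N(s,a)} \Big]
```

A large constant lets the prior/visit-count term dominate, so the search spreads over siblings (breadth-first-like behaviour), while a constant near zero makes selection purely greedy on Q and drives the search down a single line (depth-first-like behaviour); this is why the unspecified values matter for interpreting the comparison with STS.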
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: authors aim to reveal relevant dependencies between voice and image data under a crossmodal matching framework through common covariates gender id nationality each covariate is learned using a cnn from each provided domain speak recordings and face images then a classifier is determined from a shared representation which includes the cnn outputs from voicebased and imagebased covariate estimations the idea is interesting and the paper ideas are clear to follow pros new insights to support crossmodality matching from covariates competitive results against stateoftheart convincing experiments cons fixing the output dimension to d for both voice and imagebased cnn outputs could lead to unstable results indeed the comparison of voice and facebased covariate estimates are not entirely fair due to the intrinsic dimensionality can vary for each domain alternatives as canonical correlation analysis can be coupled to joint properly both domains table 4 column id results are not convincing maybe are not clear for medocsep summary the article proposes a deep learningbased approach aimed at matching face images to voice recordings belonging to the same person to this end the authors use independently parametrized neural networks to map face images and audio recordings represented as spectrograms to embeddings of fixed and equal dimensionality key to the proposed approach unlike related prior work these modules are not directly trained on some particular form of the crossmodal matching task instead the resulting embeddings are fed to a modalityagnostic multiclass logistic regression classifier that aims to predict simple covariates such as gender nationality or identity the whole system is trained jointly to maximise the performance of these classifiers given that face image voice recording pairs belonging to the same person must share equal for these covariates the neural networks embedding face images and audio recordings are thus indirectly encouraged to map face images and voice recordings belonging to the same person to similar embeddings the article concludes with an exhaustive set of experiments using the vggface and voxceleb datasets that demonstrates improvements over prior work on the same set of tasks originality and significance the article followsup on recent work 1 2 building on their original application experimental setup and model architecture the key innovation of the article compared to the aforementioned papers lies on the idea of learning facevoice embeddings to maximise their ability to predict covariates rather than by explicitly trying to optimise an objective related to crossmodal matching while the fact that these covariates are strongly associated to face images and audio recordings had already been discussed in 1 2 the idea of actually using them to drive the learning process is novel in this particular task while the article does not present substantial generalpurpose methodological innovations in machine learning i believe it constitutes a solid application of existing techniques empirically the proposed covariatedriven architecture is demonstrated to lead to better performance in the vggface voxceleb dataset in a comprehensive set of experiments as a result i believe the article might be of interest to practitioners interested in solving related crossmodal matching tasks clarity the descriptions of the approach related work and the different experiments 
carried out are written clearly and precisely overall the paper is rather easy to read and is presented using a logical easytofollow structure in my opinion perhaps the only exception to that claim lies in section 34 if possible i believe the seenheard and unseenunheard scenarios should be introduced in order to make the article selfcontained quality the experimental section is rather exhaustive despite essentially consisting of a single dataset it builds on 1 2 and presents a solid study that rigorously accounts for many factors such as potential confounding due to gender andor nationality driving prediction performance in the test set multiple variations of the crossmodal matching task are studied while in absolute terms no approach seems to have satisfactory performance yet the experimental results seem to indicate that the proposed approach outperforms prior work given that the authors claimed to have run 5 repetitions of the experiment i believe reporting some form of uncertainty estimates around the reported performance values would strengthen the results however i believe that the success of the experimental results more precisely of the variants trained to predict the covariate identity call into question the very premise of the article unlike gender or nationality i believe that identity is not a covariate per se in fact as argued in section 31 the prediction task for this covariate is not welldefined as the set of identities in the training validation and test sets are disjoint in my opinion this calls into question the hypothesis that what drives the improved performance is the fact that these models are trained to predict the covariates rather i wonder if the advantages are instead a fortunate byproduct of the more efficient usage of the data during the training process thanks to not requiring face image audio recording pairs as input typos section 24 1 imagemgiven 2 cosine similarity written using absolute value f rather than l2norm f2 3 here we are give a probe input references 1 nagrani arsha samuel albanie and andrew zisserman learnable pins crossmodal embeddings for person identity arxiv preprint arxiv180500833 2018 2 nagrani arsha samuel albanie and andrew zisserman seeing voices and hearing faces crossmodal biometric matching proceedings of the ieee conference on computer vision and pattern recognition 2018docsepthis paper aims at matching peoples voices to the images of their faces it describes a method to train shared embeddings of voices and face images the speech and image features go through separate neural networks until a shared embedding layer then a classification network is built on top of the embeddings from both networks the classification network predicts various combinations of covariates of faces and voices gender nationality and identity the input to the classification network is then used as a shared representation for performing retrieval and matching tasks compared with similar work from nagrani et al 2018 who generate paired inputs of voices and faces and train a network to classify if the pair is matched or not the proposed method doesnt require paired inputs it does however require inputs that are labeled with the same covariates across modalities my feeling is that paired positive examples are easier to obtain eg from unlabeled video than inputs labeled with these covariates although paired negative examples require labeling and so may be as difficult to obtain several different evaluations are performed comparing networks that were trained to 
predict all subsets of identity gender and nationality these include identifying a matching face in a set of faces 12 or n faces for a given voice or vice versa results show that the network that predicts identitygender tends to work best under a variety of careful examinations of various stratifications of the data these stratifications also show that while gender is useful overall it is not when the gender of imposters is the same as that of the target individual the results also show that even when evaluating the voices and faces not shown in the training data the model can achieve 832 auc on unseenunheard individuals which outperforms the stateoftheart method from nagrani et al 2018 an interesting avenue of future work would be using the prediction of these covariates to initialize a network and then refine it using some sort of ranking loss like the triplet loss contrastive loss etc writing overall ciations are all given in textual form nagrani et al 2018 in latex this is citet or cite when many times parenthetical citations nagrani et al 2018 in latex this is citep would be more appropriate the image of the voice waveform in figures 1 and 2 should be replaced by log melspectrograms in order to illustrate the networks input state or art instead of stateoftheart on page 3 in subsection 24 mgiven is written instead of given on page 6 section 31 12 matching paragraph nagrani et al is written twice page 6 mentions that there is a row labelled svhfnet in table 2 but there is no such row is this table page 7 line 1 gn should be g n ### Summary:
all reviewers agree that the proposed method is interesting and well presented the authors rebuttal addressed all outstanding raised issues two reviewers recommend clear accept and the third recommends borderline accept i agree with this recommendation and believe that the paper will be of interest to the audience attending iclr i recommend accepting this work for a poster presentation at iclr
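
to make the training idea in the reviews above concrete, here is a minimal pytorch-style sketch of the covariate-supervised shared embedding: two independent encoders map face images and voice spectrograms into the same d-dimensional space, and a shared modality-agnostic classifier predicts the covariates from both; the architectures, embedding size and class counts below are illustrative assumptions, not the papers actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Maps one modality (face image or voice spectrogram) to a d-dim embedding."""
    def __init__(self, in_channels, d=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, d),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)

face_enc = Encoder(in_channels=3)    # rgb face crops
voice_enc = Encoder(in_channels=1)   # log-mel spectrograms treated as 1-channel images

# one shared, modality-agnostic head per covariate (gender, nationality, ...);
# the class counts here are made up for illustration
covariate_heads = nn.ModuleDict({
    "gender": nn.Linear(128, 2),
    "nationality": nn.Linear(128, 30),
})

def covariate_loss(faces, spectrograms, labels):
    """Both modalities of the same person share covariate labels, so applying the
    same classifier to both embeddings indirectly pulls them together."""
    e_face = face_enc(faces)
    e_voice = voice_enc(spectrograms)
    loss = 0.0
    for name, head in covariate_heads.items():
        loss = loss + F.cross_entropy(head(e_face), labels[name])
        loss = loss + F.cross_entropy(head(e_voice), labels[name])
    return loss

def match_scores(probe_spectrogram, gallery_faces):
    """At test time matching/retrieval reduces to cosine similarity, e.g. picking
    the gallery face whose embedding is closest to a probe voice."""
    v = voice_enc(probe_spectrogram)   # (1, d), already l2-normalized
    f = face_enc(gallery_faces)        # (n, d)
    return f @ v.t()                   # (n, 1) cosine similarities
```

identity could be added as just another head over the training identities, which is exactly why the reviewers above question whether it is a proper covariate when the test identities are disjoint from training.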
[input_ids, attention_mask and labels arrays for this example (numeric token ids and 1s) omitted]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposes a new spatiotemporal adapter stadapter for parameterefficient finetuning on video tasks with a much smaller trainable parameter stadapter can match or even outperform the strong full finetuning strategy on k400 and ssv2 datasets pros 1 its a good attempt to introduce an adapter for spatiotemporal modeling using a 2d pretrained model 2 the proposed stadapter are effective it can match or even outperform the strong full finetuning strategy with a much smaller trainable parameter 3 sota results on k400 and ssv2 using a clip pretrained model cons 1 lack two important comparisons the new added stadapter has a quite different structure from common practice for extending a 2d backbone to spatiotemporal eg an additional temporal selfattention in figure1 so it would be better to disentangle the effect of the new structure eg dw 3dconv which may have a complementary impact on a full transformer architecture and learning strategy only tuning some new added modules which is also called adapter so exp1 finetuning the b of figure 1 and exp2 using a as an adapter of figure 1 would be valuable 2 clip pretrained model and imagenet21k pretrained model performs very differently for example partial finetuning using clip pretrained model on k400 only performs a little worse than full finetuning 801 vs 810 while this gap is quite large when using imagenet21k pretrained model 617 vs 769 it seems that the clip learns many video concepts which may narrow the gap between pretraining and video finetuning especially for k400 which is largely biased to spatial appearance so it may be more suitable to conduct an ablation study using imagenet21k pretrained model or using ssv2 dataset 3 similar to 2 it would be helpful to add the imagenet22k pretrained results in table 2 and table 3 no docsepthe goal of this paper is to perform parameterefficient learning on imagetovideo adaptation tasks specifically the authors propose a spatiotemporal adapter which applies an additional depthwise 3d convolution in adapters to incorporate temporal and spatial information with only a limited amount of parameters in the model finetuned in the experiment the authors showed that compared with full finetuning and other finetuning techniques the proposed stadapter can achieve better results in kinetics400 and somethingsomethingv2 datasets strengths the motivation is reasonable since the video pretraining data or pretrained models are limited or hard to obtain it is ideal to use pretrained modelsdata from other modalities and adapt to video domains the paper is easy to understand and the presentation is clear and good weaknesses since the major difference between the prior work 1 and the proposed method is depthwise 3d convolutional layers in the hidden state it is important to compare the prior work and show that such a design makes a difference in the performance a major weakness of this paper is the experimental setup as mentioned in the prior empirical study 2 on parameterefficient methods a common pitfall or mistake of the existing parameterefficient works is only presenting validation set results while doing earlystoping on the validation set this makes the reported results of the parameterefficient methods biased on the validation set and higher than the full finetuning a more reasonable experimental setup is to either report test set results or split the training set and perform earlystoping on the 
heldout set it is unclear whether this paper follows the proper experimental setup in table 1 only showing the number of finetuned parameters is a bit misleading as some methods including stadapter introduces extra parameters which makes the comparison unfair due to the different amounts of total parameters in the models across different methods comparison to the stateoftheart methods is confusing since the major focus is the effectiveness of the stadapter a fair comparison should be under the same experiment setups eg same pretrained datasets how the stadapter improves against the prior works for example the authors should present the results of 1 vitl pretrained on im21k and apply stadapter and 2 vitl pretrained on clip and without stadapter in this way the reader can understand the improvement gains solely provided by the proposed stadapter 1 parameterefficient transfer learning for nlp houlsby et al icml 19 2 revisiting parameterefficient tuning are we really there yet chen et al i do not see the authors including any discussion on limitations or negative societal impact the potential limitations of the paper are the inference time of using extra parameters and generalization to other video tasks docsepauthors propose an extension to the adapter 24 framework to build a spacetime adapter which converts a pretrained image model into a video recognition model by finetuning very few parameters on the video task the proposed adapter is very simple based upon depthwise 3d convolution added to each transformer block experiments are shown on kinetics and ssv2 where the proposed method obtains strong results often comparable to fully finetuning the network less so on ssv2 given its temporally challenging recognition task moreover the resulting model is more data and training efficient strengths 1 originality while the method is a simple extension of adapter 26 the proposed method is very simple to implement and builds upon a well known adapter architecture nevertheless it obtains strong results and to my knowledge these results are not well known in the community 2 quality comprehensive ablations authors provide comprehensive set of interesting ablations this includes apples to apples comparisons with reasonable baselines table 1 as well as properties of the adapter in fig 2 and tab 4 which show stadapter makes the model training and data efficient 3 clarity the paper is generally very well written easy to follow with useful and easy to read figures weaknesses 1 quality runtime savings while authors show impressive results in reducing parameters being finetuned yet obtaining near sota performance it is unclear what the realworld advantages of such a method is does the reduced parameters being trained lead to speed up in training time if so by how much does that make the training possible on gpus with less memory if the proposed approach makes sota video models accessible to researchers with constrained resources the impact of the paper will be much more 2 quality other backbones the results are largely limited to vit based models its not clear if the approach can also work with other popular architectures such as mvit and swin swin for instance does come with large scale pretrained models which would be interesting to adapt to videos 3 significance results without clip are not that strong authors show impressive results including 856 on kinetics while finetuning very few parameters and operating on very few frames from clip pretrained model however the results with imagenet21k are not as strong 
although they do obtain improvements over other approaches for parameter efficient finetuning its not clear why this is the case perhaps more results with other foundation models such as httpsgithubcomfacebookresearchswag could help decipher if this approach is best applicable to visiontext models or does it work more generally 4 clarity sota results missing some higher performing methods like multiview transformers for video recognition cvpr22 omnivore a single model for many visual modalities cvpr22 etc while not directly comparable since they use different architectures and pretraining they obtain better performance in table 2 and 3 and should be reported when comparing to sota 5 significance more datasets since the proposed method is fairly simple and general it would be nice to evaluate it more broadly for other video understanding tasks such as egocentric action classification eg epic ego4d long video classification eg howto100m such experiments would further throw light on the advantages and limitations of the proposed adapter minor l251 arts art l256 stat state na docsepthe paper proposes a parameterefficient imagetovideo transfer learning framework in particular the authors introduce a spatiotemporal adapter stadapter for finetuning a pretrained image model to video tasks in this setting the parameters of the original network are frozen and only the parameters of the stadapter are updated the authors demonstrate that will only 8 trainable parameters they are able to achieve competitive results to previous fully finetuned approaches on various video recognition benchmarks strengths timely and relevant topics as video models are getting larger and larger strong results with a small number of trainable parameters detailed ablation studies for the most part the paper is well written and easy to follow weaknesses the technical contribution of the paper can be seen as somewhat limited adapters and other parameterefficient modules ie lora prompttuning etc have been widely used in other domains such as nlp images etc the authors use a relatively straightforward adaptation of these ideas with the addition of 3d depthwise convolutions which are commonly used for efficient video processing to the video setting while the technical contribution is small given that this is the first attempt to the best of my knowledge to perform parameterefficient tuning on video recognition benchmarks im ok with it i wish the authors validated their proposed stadapter scheme with more backbones eg swin uniformer motionformer etc to demonstrate its generality i think it would have also been useful to see more experiments insights into the design choices for stadapter while 3d depthwise convolution is a natural choice the question remains is it the best choice have the authors experimented with any other operators eg selfattention efficient selfattention etc in my view it would be useful to see more designrelated experiments to verify that this is indeed the adapter scheme that works most effectively many of the tables listing gflops are unfair to the proposed method is it possible to highlight not only the gflops but also the number of trained parameters the proposed approach is trained using significantly fewer parameters than the remaining baselines that are fully finetuned no the limitations were not discussed ### Summary:
this paper proposes a new spatiotemporal adapter for parameterefficient finetuning of a pretrained image model on each video task after the discussion phase the requested comparisons on full finetuning finetuning only the temporal attention modules and more backbones were added and the reviewers are satisfied with the rebuttal given all the positive scores by reviewers the metareviewers recommend accepting this paper
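
as a rough illustration of the adapter structure the reviews above describe (down-projection, depthwise 3d convolution over the space-time token grid, up-projection, residual connection), here is a hedged pytorch-style sketch; the width, bottleneck size, kernel and insertion point are assumptions rather than the papers exact design.

```python
import torch
import torch.nn as nn

class STAdapter(nn.Module):
    """Bottleneck adapter whose hidden state is mixed by a depthwise 3D conv,
    giving a frozen image transformer block access to temporal context."""
    def __init__(self, dim=768, bottleneck=192, kernel=(3, 3, 3)):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.dwconv3d = nn.Conv3d(bottleneck, bottleneck, kernel,
                                  padding=tuple(k // 2 for k in kernel),
                                  groups=bottleneck)  # depthwise: one filter per channel
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x, t, h, w):
        # x: (B, T*H*W, dim) patch tokens of a video clip (class token handled separately)
        b, n, d = x.shape
        z = self.down(x)
        z = z.transpose(1, 2).reshape(b, -1, t, h, w)   # (B, bottleneck, T, H, W)
        z = self.dwconv3d(z)
        z = z.flatten(2).transpose(1, 2)                # back to (B, N, bottleneck)
        return x + self.up(z)                           # residual connection

# toy forward pass: 2 clips, 8 frames, 14x14 patch grid, ViT-B width
adapter = STAdapter(dim=768)
tokens = torch.randn(2, 8 * 14 * 14, 768)
out = adapter(tokens, t=8, h=14, w=14)
print(out.shape)  # torch.Size([2, 1568, 768])

# fine-tuning recipe discussed in the reviews: keep the pretrained image backbone
# frozen and update only the adapters inserted into each block, e.g.
#   for p in backbone.parameters(): p.requires_grad_(False)
#   for p in adapters.parameters(): p.requires_grad_(True)
```

only the adapters are updated while the clip- or imagenet-pretrained backbone stays frozen, which is where the small trainable-parameter budget discussed in the reviews comes from.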
[numeric token-id and attention_mask arrays for this example omitted]
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 4118, 8439, 27, 187, 2520, 2929, 29328, 247, 747, 7046, 7173, 358, 23702, 23675, 331, 31644, 323, 30364, 11892, 2276, 1442, 292, 25004, 327, 3492, 8892, 342, 247, 1199, 4577, 6194, 494, 4764, 331, 31644, 476, 3761, 390, 1014, 562, 32231, 253, 2266, 2120, 1442, 292, 25004, 5700, 327, 465, 8320, 285, 256, 11427, 19, 15302, 50276, 856, 84, 337, 697, 247, 1175, 3177, 281, 9569, 271, 23675, 323, 7046, 7173, 358, 23702, 14053, 970, 247, 374, 69, 3215, 11273, 1566, 374, 253, 4081, 331, 31644, 403, 3576, 352, 476, 3761, 390, 1014, 562, 32231, 253, 2266, 2120, 1442, 292, 25004, 5700, 342, 247, 1199, 4577, 6194, 494, 4764, 495, 256, 5503, 1543, 327, 465, 8320, 285, 256, 11427, 19, 970, 247, 17230, 3215, 11273, 1566, 50276, 5040, 337, 3480, 767, 1774, 14023, 253, 747, 2879, 331, 31644, 556, 247, 50276, 39911, 1027, 2605, 432, 1846, 3946, 323, 13633, 247, 374, 69, 27882, 281, 7046, 7173, 358, 23702, 24088, 271, 3081, 11935, 1881, 42959, 275, 4677, 18, 594, 352, 651, 320, 1805, 281, 557, 290, 2134, 253, 1055, 273, 253, 747, 2605, 24088, 19858, 495, 69, 13118, 534, 778, 452, 247, 19767, 3486, 327, 247, 2120, 39707, 10336, 285, 4715, 5700, 760, 25184, 690, 747, 2879, 11911, 534, 310, 671, 1925, 23675, 594, 866, 18, 1442, 292, 25004, 253, 270, 273, 4677, 337, 285, 866, 19, 970, 247, 347, 271, 23675, 273, 4677, 337, 651, 320, 9865, 374, 17230, 3215, 11273, 1566, 285, 4440, 257, 292, 1797, 76, 3215, 11273, 1566, 17923, 1077, 13359, 323, 1650, 7898, 1442, 292, 25004, 970, 17230, 3215, 11273, 1566, 327, 465, 8320, 760, 17923, 247, 1652, 7197, 685, 2120, 1442, 292, 25004, 48242, 4632, 854, 740, 1223, 436, 8037, 310, 3240, 1781, 672, 970, 4440, 257, 292, 1797, 76, 3215, 11273, 1566, 48906, 4632, 818, 2090, 352, 3133, 326, 253, 17230, 33772, 1142, 3492, 12342, 534, 778, 6891, 253, 8037, 875, 3215, 26208, 285, 3492, 1442, 292, 25004, 3340, 323, 465, 8320, 534, 310, 8127, 23539, 281, 8820, 7286, 594, 352, 778, 320, 625, 7470, 281, 2589, 271, 28913, 1263, 970, 4440, 257, 292, 1797, 76, 3215, 11273, 1566, 390, 970, 256, 11427, 19, 10895, 495, 2074, 281, 374, 352, 651, 320, 9371, 281, 823, 253, 4440, 257, 292, 1423, 76, 3215, 11273, 1543, 275, 2829, 374, 285, 2829, 495, 50276, 2369, 50276, 7152, 339, 431, 248, 4736, 273, 436, 2929, 310, 281, 1347, 30364, 11892, 2276, 4715, 327, 4440, 292, 729, 2842, 15644, 8892, 5742, 253, 4477, 12661, 247, 7046, 7173, 358, 23702, 23675, 534, 10384, 271, 3081, 6864, 3020, 495, 69, 27311, 275, 519, 49872, 281, 19071, 11935, 285, 8820, 1491, 342, 760, 247, 3710, 2408, 273, 3602, 275, 253, 1566, 1442, 292, 37437, 275, 253, 3368, 253, 4477, 2692, 326, 2429, 342, 2120, 1442, 292, 25004, 285, 643, 1442, 292, 25004, 5609, 253, 4081, 331, 31644, 476, 5115, 1805, 1543, 275, 24273, 8320, 285, 1260, 678, 723, 8792, 87, 19, 15302, 20544, 50276, 783, 16038, 310, 5272, 1580, 253, 3492, 3215, 26208, 941, 390, 3215, 11273, 3210, 403, 3710, 390, 1892, 281, 4044, 352, 310, 7445, 281, 897, 3215, 11273, 3210, 2203, 432, 643, 33433, 285, 5223, 281, 3492, 10625, 50275, 783, 2929, 310, 3477, 281, 2096, 285, 253, 9759, 310, 2590, 285, 1175, 50276, 20881, 1255, 265, 50276, 17480, 253, 2201, 3064, 875, 253, 2720, 789, 337, 285, 253, 4081, 1332, 310, 6864, 3020, 495, 69, 27311, 267, 8090, 275, 253, 8763, 1375, 352, 310, 1774, 281, 7277, 253, 2720, 789, 285, 921, 326, 824, 247, 2216, 2789, 247, 3064, 275, 253, 3045, 50274, 66, 2201, 14855, 273, 436, 2929, 310, 253, 5661, 9978, 347, 5393, 275, 253, 2720, 16774, 1263, 374, 327, 30364, 11892, 2276, 3082, 247, 1846, 8483, 12615, 390, 10551, 273, 253, 5368, 30364, 
11892, 2276, 2987, 310, 760, 15250, 12820, 873, 1543, 1223, 2509, 2393, 13121, 272, 327, 253, 12820, 873, 436, 2789, 253, 2361, 1543, 273, 253, 30364, 11892, 2276, 3082, 23539, 327, 253, 12820, 873, 285, 2169, 685, 253, 2120, 1442, 292, 25004, 247, 625, 5272, 5661, 9978, 310, 281, 2057, 1304, 1071, 873, 1543, 390, 8085, 253, 3733, 873, 285, 1347, 2393, 13121, 272, 327, 253, 2918, 483, 873, 352, 310, 12744, 1880, 436, 2929, 3637, 253, 1463, 5661, 9978, 50274, 249, 2829, 337, 760, 4645, 253, 1180, 273, 1442, 292, 37437, 3602, 310, 247, 2372, 24363, 347, 690, 3082, 1690, 331, 31644, 23970, 4465, 3602, 534, 2789, 253, 5301, 16593, 1955, 281, 253, 1027, 8322, 273, 2264, 3602, 275, 253, 3210, 2439, 1027, 3082, 50274, 47109, 281, 253, 1375, 23037, 14387, 3082, 310, 21643, 1580, 253, 2201, 2770, 310, 253, 12510, 273, 253, 331, 31644, 247, 4344, 5301, 943, 320, 762, 253, 1072, 3368, 873, 8777, 24088, 1072, 3215, 11273, 15302, 849, 253, 331, 31644, 19132, 1411, 253, 2720, 2987, 323, 1650, 253, 4477, 943, 1246, 253, 1543, 273, 337, 9084, 77, 3215, 11273, 327, 516, 1797, 76, 285, 4647, 331, 31644, 285, 374, 9084, 77, 3215, 11273, 327, 17230, 285, 1293, 331, 31644, 275, 436, 1039, 253, 9414, 476, 2096, 253, 7756, 15988, 12718, 2530, 407, 253, 4081, 331, 31644, 50276, 18, 30364, 11892, 2276, 3700, 4715, 323, 295, 24343, 288, 3941, 84, 1615, 1162, 355, 17857, 1686, 655, 50276, 19, 27694, 2996, 30364, 11892, 2276, 25184, 403, 359, 1663, 627, 2568, 260, 864, 1162, 355, 50276, 74, 513, 417, 923, 253, 4477, 1690, 667, 5955, 327, 7364, 390, 4016, 38058, 3486, 253, 2442, 7364, 273, 253, 2929, 403, 253, 17032, 673, 273, 970, 4465, 3602, 285, 26647, 281, 643, 3492, 8892, 50276, 7152, 33032, 43355, 12661, 271, 6880, 281, 253, 23675, 2164, 7792, 281, 1973, 247, 29380, 23675, 534, 28472, 247, 3215, 11273, 2460, 1566, 715, 247, 3492, 8981, 1566, 407, 1442, 292, 25004, 1077, 1643, 3602, 327, 253, 3492, 4836, 253, 4081, 23675, 310, 1077, 2969, 1754, 2220, 6864, 3020, 495, 69, 27311, 2879, 281, 1016, 39707, 2972, 4679, 403, 2011, 327, 24273, 285, 256, 11427, 19, 835, 253, 4081, 1332, 31326, 2266, 1543, 2223, 10870, 281, 4751, 1442, 292, 25004, 253, 2990, 1679, 594, 327, 256, 11427, 19, 1677, 697, 5897, 595, 11132, 8981, 4836, 25761, 253, 4795, 1566, 310, 625, 941, 285, 3733, 5919, 50276, 296, 3755, 20556, 337, 3236, 414, 1223, 253, 1332, 310, 247, 2969, 6880, 273, 23675, 3436, 253, 4081, 1332, 310, 1077, 2969, 281, 3359, 285, 21168, 2220, 247, 973, 1929, 23675, 10336, 17837, 352, 31326, 2266, 1543, 285, 281, 619, 3640, 841, 1543, 403, 417, 973, 1929, 275, 253, 3114, 374, 3290, 11088, 490, 77, 569, 4477, 2085, 11088, 873, 273, 4722, 490, 77, 569, 436, 3797, 28580, 281, 28580, 14023, 342, 5272, 1666, 25379, 2829, 337, 347, 973, 347, 3607, 273, 253, 23675, 275, 3036, 374, 285, 10334, 577, 534, 921, 331, 31644, 2789, 253, 1566, 3733, 285, 941, 5919, 495, 19843, 253, 2929, 310, 3839, 1077, 973, 3542, 3477, 281, 956, 342, 4217, 285, 3477, 281, 1239, 8442, 50274, 20881, 1255, 265, 337, 3290, 20243, 16347, 1223, 4477, 921, 13943, 1543, 275, 8493, 3602, 1146, 1442, 292, 37437, 2568, 13546, 2822, 256, 5503, 3045, 352, 310, 12744, 752, 253, 1524, 10186, 11361, 273, 824, 247, 1332, 310, 1057, 253, 3777, 3602, 1146, 10166, 1421, 281, 3885, 598, 275, 3733, 673, 604, 594, 407, 849, 1199, 1057, 326, 1056, 253, 3733, 1896, 327, 31025, 316, 342, 1679, 3541, 604, 253, 4081, 2746, 2789, 256, 5503, 3492, 3210, 12482, 281, 8607, 342, 20793, 5300, 253, 3486, 273, 253, 2929, 588, 320, 1199, 625, 374, 3290, 643, 896, 47473, 253, 1543, 403, 
8127, 3710, 281, 9084, 1754, 3210, 697, 417, 2590, 604, 253, 2746, 476, 671, 789, 342, 643, 4633, 35615, 824, 347, 278, 34490, 285, 1863, 249, 1863, 249, 323, 4227, 1057, 1705, 342, 1781, 4311, 3215, 11273, 3210, 534, 651, 320, 4722, 281, 5223, 281, 10556, 495, 8453, 1543, 1293, 17230, 403, 417, 326, 2266, 4477, 921, 13943, 1543, 1690, 854, 3208, 327, 24273, 1223, 1442, 292, 25004, 1077, 1643, 3602, 285, 6498, 327, 1077, 1643, 13009, 432, 17230, 3215, 11273, 1566, 2299, 253, 1543, 342, 4440, 257, 292, 1797, 76, 403, 417, 347, 2266, 3738, 597, 513, 4044, 11701, 689, 643, 7274, 323, 4764, 5919, 1442, 292, 25004, 697, 417, 2590, 2139, 436, 310, 253, 1083, 4931, 625, 1543, 342, 643, 12153, 3210, 824, 347, 5987, 7280, 681, 24557, 36642, 2140, 356, 812, 1361, 1086, 6894, 604, 436, 2746, 310, 1682, 7763, 281, 8113, 1156, 3210, 390, 1057, 352, 789, 625, 3839, 50276, 21, 19843, 256, 5503, 1543, 5816, 690, 2169, 9591, 3082, 751, 1554, 400, 827, 4979, 398, 323, 3492, 8981, 30105, 1087, 1423, 33039, 400, 410, 247, 2014, 1566, 323, 1142, 5304, 33433, 30105, 1087, 1423, 3966, 1223, 417, 3587, 10870, 1580, 597, 897, 1027, 35615, 285, 3215, 26208, 597, 4044, 1805, 3045, 275, 2829, 374, 285, 495, 285, 943, 320, 2361, 672, 10941, 281, 256, 5503, 608, 8453, 625, 15302, 1580, 253, 4081, 1332, 310, 9648, 2969, 285, 2087, 352, 651, 320, 5322, 281, 7472, 352, 625, 21450, 323, 643, 3492, 4685, 8892, 824, 347, 24088, 406, 19458, 2250, 9162, 24088, 19876, 23057, 21, 69, 1048, 3492, 9162, 24088, 849, 936, 2313, 78, 824, 4679, 651, 2007, 4710, 1708, 327, 253, 11361, 285, 7364, 273, 253, 4081, 23675, 50275, 37585, 50276, 77, 21451, 14635, 50276, 435, 50276, 77, 9726, 1098, 50276, 3409, 5549, 5474, 339, 431, 248, 2929, 29328, 247, 30364, 11892, 2276, 4440, 292, 729, 2842, 3700, 4715, 7792, 275, 1798, 253, 4477, 9569, 247, 7046, 7173, 358, 23702, 23675, 331, 31644, 323, 1442, 292, 25004, 247, 3215, 11273, 2460, 1566, 281, 3492, 8892, 275, 436, 4758, 253, 3602, 273, 253, 3236, 2990, 403, 13831, 285, 760, 253, 3602, 273, 253, 50276, 25134, 3758, 403, 9300, 253, 4477, 7568, 326, 588, 760, 854, 6194, 494, 3602, 597, 403, 2104, 281, 5115, 12085, 1543, 281, 2045, 4751, 1442, 292, 37437, 7274, 327, 2710, 3492, 8981, 49602, 50276, 296, 3755, 20556, 50276, 38764, 285, 4623, 12989, 347, 3492, 3210, 403, 2970, 4067, 285, 4067, 50276, 9072, 1543, 342, 247, 1355, 1180, 273, 6194, 494, 3602, 50276, 5992, 7193, 28913, 2175, 323, 253, 954, 629, 50275, 783, 2929, 310, 973, 3542, 285, 3477, 281, 956, 50276, 20881, 1255, 265, 50276, 783, 7681, 7680, 273, 253, 2929, 476, 320, 2326, 347, 8489, 3710, 519, 49872, 285, 643, 30364, 11892, 2276, 11911, 26332, 298, 6464, 8959, 85, 25004, 3966, 452, 644, 7561, 908, 275, 643, 10625, 824, 347, 295, 24343, 3888, 3966, 253, 4477, 897, 247, 4942, 15246, 15644, 273, 841, 5697, 342, 253, 1635, 273, 495, 69, 6864, 3020, 2410, 17009, 534, 403, 7744, 908, 323, 5919, 3492, 5162, 281, 253, 3492, 4758, 1223, 253, 7681, 7680, 310, 1355, 1677, 326, 436, 310, 253, 806, 3177, 281, 253, 1682, 273, 619, 3640, 281, 1347, 30364, 11892, 2276, 25184, 327, 3492, 8981, 49602, 516, 8718, 342, 352, 50275, 74, 5730, 253, 4477, 17618, 616, 4081, 331, 31644, 6974, 342, 625, 896, 47473, 24088, 1863, 249, 6447, 254, 3200, 19946, 3966, 281, 7568, 697, 31376, 50276, 74, 1158, 352, 651, 452, 671, 644, 4217, 281, 923, 625, 4679, 50276, 968, 4380, 715, 253, 2216, 10165, 323, 331, 31644, 1223, 495, 69, 6864, 3020, 27311, 310, 247, 3626, 4327, 253, 1953, 4558, 310, 352, 253, 1682, 4327, 452, 253, 4477, 3368, 264, 342, 667, 643, 9158, 
24088, 1881, 42959, 5919, 1881, 42959, 3966, 275, 619, 1859, 352, 651, 320, 4217, 281, 923, 625, 2216, 4919, 4679, 281, 12654, 326, 436, 310, 6296, 253, 23675, 6974, 326, 2987, 954, 8069, 50276, 20415, 273, 253, 7180, 16485, 305, 1258, 2695, 403, 16593, 281, 253, 4081, 1332, 310, 352, 1896, 281, 6780, 417, 760, 253, 305, 1258, 2695, 533, 671, 253, 1180, 273, 10166, 3602, 253, 4081, 2746, 310, 10166, 970, 3012, 11184, 3602, 685, 253, 5780, 1666, 25379, 326, 403, 4751, 1442, 292, 37437, 642, 253, 7364, 497, 417, 5469, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 747, 7046, 7173, 358, 23702, 23675, 323, 30364, 11892, 2276, 1442, 292, 25004, 591, 3492, 4836, 9495, 432, 247, 3215, 11273, 2460, 1566, 846, 253, 5955, 3408, 253, 9521, 14023, 327, 2120, 1442, 292, 25004, 1442, 292, 25004, 760, 253, 11935, 4116, 11911, 285, 625, 896, 47473, 403, 2879, 285, 253, 30628, 403, 10048, 342, 253, 30080, 22559, 1677, 512, 253, 2762, 7363, 407, 30628, 253, 1313, 609, 1374, 398, 5583, 18738, 436, 2929 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: The paper proposes to first predict a coarse 32^3 voxel grid by aggregating independent predictions from individual views, then translate it into a mesh and refine it using DeepMVS predictions (using each view in turn as a reference view) and a GCN architecture on the mesh.

On the positive side: I like the idea of using MVSNet, but why not use it from the start, before the single-view voxel prediction? I think this paper is going toward a render-and-compare approach for 3D shape prediction, which I think is a good idea. The boost in the results seems impressive compared to P2M.

There are, however, several things I don't like or that worry me about this paper. The pipeline presented in this paper is extremely complicated and has many different parts; after reading it I have no idea what really makes the improvement compared to P2M. It uses voxels, meshes and depth maps, graph convolution networks, an attention-based architecture, SVR and DeepMVS, and the training loss has 5 balancing hyperparameters between things as different as cross-entropy and Chamfer distance. To me, the ablation studies (Tables 2 and 3) show clearly that the most complex parts of the pipeline (Section 3.2: contrastive depth and attention-based aggregation) only provide very minor improvements (1%); given their complexity and number of hyperparameters, I do not think these can be considered significant. Given these results, it is completely unclear to me how the proposed approach can lead to a 14% improvement over Pixel2Mesh. I thus think the approach should be strongly simplified, maybe losing 1% in final performance, but the paper should provide a clear ablation that actually explains why their framework is so much better than P2M, and this is interesting. Right now I believe it could be for a bad reason: for example, DeepMVS could give excellent results on synthetic data because the data is too simple. (Note: I realize that Table 3 shows it is not perfect, since there is a further 3.5% boost using GT depth, but it could still be unrealistically good for synthetic data.)

Related work is lacking discussion of important references, namely all classical references for point-based SfM in 2.1, FoldingNet and AtlasNet for mesh generation in 2.2, all implicit volumetric works also in 2.2 (DeepSDF, Occupancy Networks), and the most classical deep depth prediction works in 2.3 (Eigen and Fergus).

To summarise: despite its impressive numbers, I think this paper cannot be accepted as is, mainly because of its complexity, the lack of a clear explanation for its huge performance boost, and the only marginal (not significant) boosts given by the most complex parts of the pipeline.

Some additional notes on presentation: I am not sure "contrastive depth" is a good choice of name, since contrastive feature learning is a popular but unrelated research direction. I found Section 3.2 very hard to parse/reorder; I could only do it with the help of Fig. 1, which is itself hard to parse and does not represent e.g. how the attention-based pooling happens.

docsep Overview: this paper proposes a system for reconstructing 3D objects from multi-view images. The system consists of a single-view voxel generation network, a multi-view voxel fusion mechanism, a multi-view depth estimation network, and a refinement network aggregating multi-view depth features. The major contribution is in the refinement stage, upon the coarse reconstruction obtained from voxel predictions, particularly the introduction of the attention-based multi-view feature pooling method.
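To make the described dataflow concrete, here is a minimal sketch of how such a multi-stage system might be wired together; all module names, tensor shapes, and the simple mean-fusion rule are illustrative assumptions, not taken from the paper or its code:

import torch

def single_view_voxel_net(image):
    # placeholder for a 2D CNN that predicts a 32^3 occupancy grid from one view
    return torch.zeros(1, 32, 32, 32)

def fuse_voxel_grids(per_view_grids):
    # assumed fusion rule: average the per-view occupancy predictions
    return torch.stack(per_view_grids).mean(dim=0)

def multi_view_depth_net(images, ref_idx):
    # placeholder for an MVSNet/DeepMVS-style network predicting depth for the reference view
    return torch.zeros(1, 224, 224)

def refine_mesh_with_gcn(coarse_mesh, depth_features):
    # placeholder for a GCN that moves mesh vertices using pooled multi-view depth features
    return coarse_mesh

images = [torch.zeros(3, 224, 224) for _ in range(3)]  # N input views
voxel_grid = fuse_voxel_grids([single_view_voxel_net(im) for im in images])
coarse_mesh = voxel_grid  # stand-in for a voxel-to-mesh conversion step (e.g. cubify)
depth_features = [multi_view_depth_net(images, i) for i in range(len(images))]
refined_mesh = refine_mesh_with_gcn(coarse_mesh, depth_features)

The reviewer's questions below about coordinate frames and mesh topology concern exactly the fusion and refinement steps in this sketch.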
Novelty: according to the paper and the attached code, it seems like the authors mostly utilized existing networks to build a system. The authors introduce their attention-based multi-view feature pooling mechanism, which is new. Despite the results, the system is rather bulky and ad hoc; for the use of the GCN in refinement, see question 2.

Results: the paper achieves plausible state-of-the-art quantitative results on standard evaluation sets and metrics, and the visual quality is reasonable. However, from Figure 3 it seems like the reconstructed local surface suffers from noise; the results struggle to achieve clean surfaces, especially when compared to implicit-based methods such as DeepSDF. The authors did not provide more qualitative results in the supplemental material.

Clarity: this paper is well written and easy to understand; the attached code is well documented and can be deployed.

Conclusion: overall this is a well-written paper with plausible outcomes. The reviewer believes this paper carries out reasonable efforts and insights into this topic. The reviewer is marginally positive towards its acceptance due to the pleasing results, but holds a conservative attitude towards the significance of its contribution. The reviewer would like to see the questions addressed in the rebuttal period, while also referring to the other reviews.

Questions:
1. For each single-view voxel prediction, the paper did not clarify which coordinate system those voxels are in. When aggregating the multi-view voxel grids, how is the coordinate transformation handled between different viewpoints? If voxels from different coordinate systems should undergo a transformation, how is interpolation handled when merging into a single 32x32x32 grid?
2. Use of the GCN: as the GCN only optimizes the current mesh, it cannot correct topology errors occurring after the coarse reconstruction. How would this method overcome this, especially when the cubified mesh has the wrong topology?
3. Use of depth: from the multi-view predicted depths, one can simply reconstruct from the depths, or run a differentiable renderer to optimize the mesh geometry directly. Why would we need contrastive depth feature extraction?

docsep Quality: overall the quality of this work is high; the quantitative and qualitative results are impressive relative to the SOA. I would like to see the qualitative results for the best model as opposed to just the "pretty" model, and I'm curious why the best qualitative model was not the same as the best quantitative model; I would think analyzing this difference could give the authors insight into how to improve the model.

Clarity: overall the paper is written clearly, explaining and justifying the different components of the model clearly. There are a few issues/questions I have. Page 2: change "nonreflective" to "reflective". For depth estimation, I'm wondering why you changed the MVSNet loss function to use berHu instead of the L1 used in the original paper. Could you define the terms in the berHu criterion: what are x and c? It would also be good to shed some intuition on why this criterion is the right one. The mixing constants in your loss function (lambda) vary across several orders of magnitude; how were those selected? On page 6 you state that two values of tau are used, but elsewhere in the paper tau is defined as 104 and you use tau and 2*tau.
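For reference, the reverse Huber (berHu) criterion as commonly used in depth estimation (e.g. Laina et al., 2016) has the form below, where x is the per-pixel residual between predicted and ground-truth depth and c is a threshold, often set per batch to one fifth of the maximum absolute residual; whether the paper under review uses exactly this variant and this choice of c is an assumption, not something stated in the review:

\mathcal{B}(x) =
\begin{cases}
|x| & \text{if } |x| \le c, \\
\dfrac{x^2 + c^2}{2c} & \text{otherwise,}
\end{cases}
\qquad
c = \tfrac{1}{5}\,\max_i |x_i|

The quadratic branch equals |x| at the threshold, so the loss is continuous: L1-like for small residuals and L2-like for large ones, which is presumably the motivation for preferring it over plain L1.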
Originality: the paper generally uses a mix of SOA techniques creatively woven together in a fairly sophisticated model; other novel aspects, such as using the neural renderer to create the contrastive depth module, were interesting.

Significance: this work is significant based on the importance of the problem (this is one of the harder and most important problems in computer vision today), on the quality of its results, and on the creative way it combines SOA methods to provide multiple semi-supervised losses.

### Summary:
This submission is an interesting case. The method it presents appears to work quite well, achieving state-of-the-art quantitative reconstruction results, though qualitatively the reconstructed surfaces are locally noisy. The method is quite complex, which different reviewers saw as either a strength or a weakness ("a mix of SOA techniques creatively woven together in a fairly sophisticated model" vs. "bulky and ad hoc"). Most critically, it appears that the reasons for the method's significant 14% improvement over the prior art for this problem (Pixel2Mesh) are not due to the novel contributions that the paper focuses on (multi-headed attention, contrastive depth loss); rather, it is other system design choices, which are not novel research contributions, that make up all but 1% of this difference, primarily using a voxel grid predictor to get the initial mesh as opposed to an initial ellipsoid mesh. It might be possible for the authors to write a systems paper supporting these design decisions and showing how they lead to better results; however, this is not the paper the authors have written, and the majority of the technical detail in the paper is focused on method components that make minimal impact. I would also argue that this hypothetical paper would not necessarily be appropriate for ICLR, since it does not focus on any new representations; it would be better suited to a venue such as CVPR, ICCV, or 3DV. PS: Reviewer 5 deserves all of the credit for noticing this major issue with the paper.