Columns:
Input: string, length 251 to 41.6k
Output: string, length 137 to 9.7k
input_ids: list, length 157 to 2.05k
attention_mask: list, length 157 to 2.05k
labels: list, length 157 to 2.05k
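This schema matches a Hugging Face datasets layout: each row pairs a review-summarization prompt (Input) and its target summary (Output) with a pre-tokenized copy (input_ids, attention_mask, labels). As a minimal sketch of how such a row could be inspected, assuming the data is published as a Hugging Face dataset (the repository id below is hypothetical, not the actual name):

```python
from datasets import load_dataset

# Hypothetical repository id; substitute the actual dataset name.
ds = load_dataset("user/peer-review-summarization", split="train")

row = ds[0]
print(row["Input"][:200])      # review prompt text (251 to 41.6k chars)
print(row["Output"][:200])     # target summary text (137 to 9.7k chars)
print(len(row["input_ids"]))   # tokenized prompt + summary (157 to 2.05k tokens)
assert len(row["input_ids"]) == len(row["attention_mask"]) == len(row["labels"])
```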
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper presents a selfsupervised learning with an information maximization criterion among alternative latent representations of the same input that naturally prevents dimensional collapse it considers a secondorderstatistics based mutual information measure the logdeterminant mutual information ldmi which is equivalent to shannon mutual information under gaussian distribution a further firstorder approximation to the logdeterminant of the sum of two matrices is used to simplify the final objective to a euclidean distancebased objective function regularized by the logdeterminant of the feature covariance matrix consequently it avoids the collapse problem establishes relevance among alternative representations by increasing the linear dependence among them experiments on 4 image datasets show that the proposed approach gives better results than contrastive and noncontrastive methods the paper is well written and easy to follow however there are several issues id like to address the revised version has cleared my concerns 1 this following statement is misleading a common selfsupervision task is to match the latent representations that come from the distortions of the same input its true in computer vision but not true in speech or natural language processing 2 the following statement is not quite right maximizing smi between the representations of the same input is a challenging task whose implementation would require relatively large sample sizes 11 12 11 demonstrates that when mutual information is large existing variational lower bounds degrade and exhibits either high bias or high variance in fact mcallester and stratos prove that serious statistical limitations are inherent to any lower bound method of measuring mutual information more specifically any distributionfree highconfidence lower bound on mutual information estimated from n samples cannot be larger than oln n 3 collapse is a central concern for ssl with the same input in computer vision reference david mcallester and karl stratos formal limitations on the measurement of mutual information the twenty third international conference on artificial intelligence and statistics 108875884 2020 yes docsepthis paper extensively addresses the collapse problem by proposing secondorder statisticsbased mutual information measure that reflects the level of correlation among the inputs it claims that maximizing this correlative information measure between alternative representations of the same input serves two purposes 1 it avoids the collapse problem by generating feature vectors with nondegenerate covariances 2 it establishes relevance among alternative representations by increasing the linear dependence among them all these are proved to simplify as a regularization term which acts as a natural barrier against feature space degeneracy in general the paper is well written and reasonably understandable i found the underlying theory to be very strong however the presented experiments arent sufficient to show the strength of the work strengths 1 the paper is theoretically grounded on correlative information measure of representation additionally it is very well reasoned i like the way it is presented 2 on classification downstream task the proposed methods has obtained stateoftheart results one some datasets although i found that the results are not robust and the comparison is not complete please also see my experiment 
related comments in the weaknesses section weaknesses 1 the experiments shown are bit limited and not very robust currently the experiments are only done on one downstream task ie classification more experiments on other tasks such as object detection semantic segmentation fewshot learning representation transferability etc could have been interesting and could have strengthen the method moreover ablation of the model on different hyperparameters should have been studied to understand the strength of the model furthermore standard experimental strategies are not followed as an example for classification tasks selfsupervised models follow linear and finetuning strategies which is not followed in this work 2 i wonder if there is any way to estimate the amount of mutual information maximized via the proposed corinfomax approach it would be interesting to see some intermediate results or plots showing the evolution of estimated mutual information as the training progresses also it would have been interesting to see analysis on the performance behaviour of the model with evolution of mutual information does higher mutual information always guarantee a better trained model the authors have not mentioned about the limitation of the work in the paper a possible limitation could be the consideration of only the global information and not considering the local information which might make this method only applicable for global task such as classification i suspect this type of model wont work very well on the task like involving finegrained feature understanding docsepthe paper introduced corinfomax a second order statistics based mutual information method to avoid collapse problem in self supervised learning ssl it used logdeterminant mutual information ldmi as the measure between data pairs with different augmentations mathematically the paper gave stepbystep formulation to derivate the final form of the objective via its firstorder taylor series approximation experimental results on cifar10cifar100imagenet gives competitive results strengths this work is novel and well intuited collapse issue is a wellknown problem in ssl traditionally shannon mutual information smi is applied to mitigate the collapse issue with several drawbacks such as large batch requirement the logdeterminant mutual information ldmi naturally enforces latent distributions have nondegenerate distributions which can avoid collapse therefore this method is well motivated and makes sense clean and neat formula step by step formula gives clean loss format in eq 8 it clearly indicated data under different augmentation should be closed to each other and log determinant formula enforces the model from model collapse the result is elegent this result shows stateoftheart ssl performance weakness mathematical formula is too heavy in the paper it is a bit hard for audience to follow i would strongly recommend a more detailed inference in any appendix to show how they are coming from more though experiments should be provided to demonstrate the effectiveness of the approach for example the paper argues the drawbacks for smi and introduce ldmi measure as a better solution to avoid collapse however no evidence shown hot does it better avoids collapse yes docsepthis paper presents a new selfsupervised learning method based on the logdeterminant mutual information proposed earlier experiments on three small size datasets and one medium size dataset show the effectiveness of the proposed method strengths 1 the experimental results are good 2 
overall the article is easy to read weaknesses 1 the experimental part is relatively weak 2 the novel of this paper is limited the main contribution of this paper is to directly generalize the earlier proposed logdeterminant mutual information to the field of selfsupervised learning 3 section 3 is superfluous the authors have adequately addressed the limitations and potential negative societal impact of their work ### Summary:
the paper describes a selfsupervised learning method based on an information maximization criterion that naturally prevents dimensional collapse the authors consider the shannon mutual information under the assumption that the data is gaussian a firstorder approximation to the logdeterminant of the sum of two matrices is used to simplify the final objective experiments on 4 image datasets show that the proposed approach gives better results than contrastive and noncontrastive methods strengths 1 the paper is well written and easy to follow 2 the paper is theoretically grounded on correlative information measure of representation 3 strong results on some downstream classification problems 4 initially the experiments included only one downstream task regarding classification but the paper has been updated to include also results for object segmentation and detection task 5 novel and well motivated 6 stateoftheart ssl performance weaknesses some weaknesses are pointed out by reviewer gzwk but these are not well justified decision a majority of reviewers vote for acceptance the only reviewer voting slightly towards rejection is gzwk with a reasoning that is not well justified for example the main criticisms mentioned by reviewer gzwk the paper directly generalizes the earlier proposed logdeterminant mutual information to the field of selfsupervised learning this paper does not give a deepgoing analysis that why the secondorder statistics can play a important role in selfsupervised learning are not mentioned by any of the other reviewers because of this i have decided to accept the paper
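For orientation, the approximation the first example's reviews and summary refer to is the first-order Taylor expansion of the log-determinant of a sum of matrices. Stated in generic notation (A a positive-definite covariance estimate, E a small perturbation), as a sketch rather than a verbatim reproduction of the paper's equations:

\log\det(A + E) \;\approx\; \log\det(A) + \operatorname{tr}\!\left(A^{-1}E\right)

Applied to the regularized feature covariances of two augmented views, an expansion of this kind is what lets the log-determinant mutual information criterion simplify to a Euclidean distance term between the two representations plus a \log\det(R_z + \varepsilon I) regularizer; the regularizer keeps the feature covariance R_z non-degenerate and thereby acts against dimensional collapse, which is the mechanism the reviews describe.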
[input_ids: token-id encoding of the prompt and target summary above; attention_mask: all ones; labels: a copy of input_ids]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary this paper proposes a new method called halfspace stochastic projected gradient hspg to find a group sparse solution of regularized finitesum problems theoretical analysis tries to show the sparsity identification guarantees in experiments the effectiveness of hspg was verified on the classification tasks strength the idea behind the proposed method seems to be reasonable and interesting weakness a major concern is the correctness of the statements in the equation 97 in the proof the equation ebex 0 is used essentially and it is also stated in page 6 however i think it does not hold because the proximal operator associated with sparse regularization is nonlinear it may be probably difficult to fix this issue minor comment there is a missing reference it is known that rda has the superior ability to find a manifold structure of solutions as shown in the following paper s lee and s j wright manifold identification in dual averaging for regularized stochastic online learning jmlr 2012 improvement if there is a misunderstanding in my review id appreciate it if you could mention them docsepthis paper proposed a new algorithm for the group sparsity regularization problem they claim most existing algorithms though return solutions with low objective function value only give dense solutions and cannot effectively ensure the desired structured sparsity the new technique requires an initialization that is closed to some truly sparse local minimum which is achieved by running proximal gradient descent first then they proposed a new halfspace iterative step to force elements in specific groups exactly to zero the authors also provide convergence analysis and numerical evidence for the newly proposed algorithm comments 1 it is not clear to me in theorem 1 how are the parameters depend on the confidence tau i am confused as it seems no parameter is explicitly dependent on tau so the convergence in theorem is almost surely one i skim the proof and find that dependence vanished on the page appendix 10 proof of lemma 6 i dont understand why 1theta can be omitted please clarify 2 where is np defined np is used almost everywhere for example in statement of theorem 1 and algorithm 1 i didnt find the definition of it i guess np mink xk x r2 and r is further constrained by 2delta1 r 0 3 in theorem 1 and proposition 1 only asymptotic and polynomial bounds are given no rate of convergence for either the initialization phase or the halfspace projection phase docsepthe paper studies how to solve a class of group sparsity regularized minimization problems in particular a halfspace stochastic projected gradient hspg method is proposed which is based on the proxsg and a new halfspace step that promotes group sparsity this step is to decompose the feasible space and then perform group projection convergence analysis is provided together with the theoretical discussion that hspg has looser requirements to identify the sparsity patter than proxsg numerical experiments on the dcnns based image classification shows the proposed method achieves the stateoftheart performance in terms of accuracy the work looks interesting with wide applications especially in deep neural networks however the novelty is incremental and limited 1 there are some places where the notation is confusing vectors and scalars are constantly not distinguished 2 practical guidance on the selection of the parameters lambda and 
varepsilon could be provided 3 in the numerical experiments comparison of computational complexity and running time for the listed methods is not provided discussions on the group sparsity level and noise robustness could be included docsepin summary this paper proposes a postprocessing algorithm on the estimator obtained by the usual proximal stochastic gradient method this leads to an estimator with enhanced group sparsity without the sacrifice of accuracy my major concern is how to use such a group sparsity result we end up with a more group sparse estimator which is good but i feel like that is not the end of the story in a deep neural network the group sparsity seems not the major point people care about my hope is that the enhanced group sparsity can be used to guide maybe the design of the neural network structure or at least provide some better understandings of the model for example if we always see that some groups of filters are inactive then we may modify the neural network structure accordingly etc in all my understanding is that the group sparsity could be an intermediate result that can be further analyzed and used to improve the design of the model instead of being the final goal itself if we start with different initialization probably just slightly different then will we end up with the same final estimator or at least the same group sparsity results after obtaining the final estimator with group sparsity can we refit the model on the active group index only will this improve the performance ### Summary:
the paper received four borderline reviews overall the manuscript has improved after the rebuttal in particular an issue in the convergence proof has been fixed and a reviewer has increased his score to borderline accept yet the paper did not convince the reviewers that the contribution was significant enough and none of the reviewer got enthusiastic about the paper the main issue with the paper seems to be the unclear positioning between the optimization literature for stochastic composite optimization the literature on support identification eg nutini 2019 and the more empirical deep learning literature the paper postulates that the groupsparsity regularization is crucial for deep neural networks which seems to be the main motivation of the paper yet the experiments do not demonstrate any concrete consequence of better group sparsity wether it is in terms of accuracy or interpretability if positioned in this literature a comparison should be made with classical pruning approaches where pruning occurs as an iterative procedure that is distinct from optimization if positioned instead in the stochastic optimization literature better analysis of the convergence rates should be provided if positioned in the support identification literature the paper should explain how the results compare to those of the literature eg nutini 2019 and others in other words any point of view requires clarifications and additional discussions besides the theoretical assumptions need to be discussed does the lipschitz assumption holds for multilayer neural networks certainly not for relu networks but what can we say something useful even with smooth activation functions the experimental setup needs more details reproducing the experiments with the current paper seems difficult in particular the choice of hyperparameters is not crystal clear for these reasons the area chair recommends to reject the paper but encourages the authors to resubmit to a future venue while taking into account the previous comments
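To make the group-sparsity machinery in the second example concrete, the proximal step that ProxSG-style methods (the baseline HSPG builds on) apply after a stochastic gradient step is standard group soft-thresholding. The sketch below shows only that baseline step; HSPG's half-space projection, whose exact form is the paper's contribution, is not reproduced here:

```python
import numpy as np

def prox_group_l2(x, groups, lam):
    """Group soft-thresholding: prox of lam * sum_g ||x_g||_2."""
    out = x.copy()
    for g in groups:
        norm = np.linalg.norm(x[g])
        # Shrink the whole group toward zero; the group becomes exactly
        # zero once its norm falls below the threshold lam.
        out[g] = 0.0 if norm <= lam else (1.0 - lam / norm) * x[g]
    return out

# Example: two non-overlapping groups of three coordinates each.
x = np.array([0.05, -0.02, 0.03, 1.0, -0.8, 0.5])
groups = [np.arange(0, 3), np.arange(3, 6)]
print(prox_group_l2(x, groups, lam=0.1))  # the small first group is zeroed out
```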
[input_ids: token-id encoding of the prompt and target summary above; attention_mask: all ones; labels: a copy of input_ids]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: docsepthe paper is tackling the topic of finegrained representation in contrast to comparable methods arp focus on subsampled via strided operation to represent finegrained information arp is an intuitive strategy to extract critical regions and is more concise compared with multistage cascade architecture this fact facilitates the efficiency strong points of the submission well written paper which is easy to follow based on clear formulation for proposed method the work provides a detailed description of the implementation weak points of the submission lacking extensive use cases to demonstrate that arp can crop a more discriminative region it would be useful to provide a visualization that compares with other methods especially the hard and tiny critical key regions while arp assists in finegrained representation this idea technically is usable for other image tasks given sec2 and compared baseline in sec4 i am concerned that there are similar approaches proposed in the field of nonfine grained recognition if this paper is to be accepted i believe it needs a comprehensive survey about adaptive pooling strategy why the baselines in tables 1 and 2 do not include approaches in 2021 for example 1 it is good and necessary that compare arp with a functionally similar method in sec 43 stn is wellknown but old according to the authors survey is there not a better and closer method in followup works of stn arp seems to be a plugandplay module can the original results be boosted if apply it to existing architectures this experiment could help prove the applicability of arp 1 huynh s v a strong baseline for vehicle reidentificationccvpr 2021 41474154 docsep motivation the paper argues that a there are some small but important regions of an interesting object which are usually ignored by the previous finegrained methods b excessive reduction operations on image resolution fade the discriminative features method according to the above observations the paper proposes a pooling algorithm called adaptive region pooling arp this module has two procedures a learn to crop image regions b downsampling various size regions using bilinear operation experiments the proposed method is verified in two tasks finegrained classification and reidentification experimental results show that the improvements are promising but i have three considerations a the proposed method is mainly based on the assumption that there is only one target object in an image but this is a limitation of the paper is there a potential limitation that the module could not work well on the images with multiple isolated objects or the arp is sensitive to the noise in images b most of the datasets used in the papers experiments are small please verify the proposed method on a largescale dataset like inaturalist used in competitive methods such as tasn to verify on largescale datasets is also helpful for erasing the above limitation the generalization of the proposed method is the biggest concern due to the fact that images used in the experiments have a clean and single target object it could be better to use a largescale dataset to verify the generalization ability of the proposed method docsepthis paper presents an adaptive region pooling arp method for finegrained representation learning arp crops the features with the estimated scale the features are further downsampled to a consistent size through bilinear downsampling 
experiments conducted on two tasks validate the effectiveness of the proposed method strengths 1 both tasks of finegrained image classification and vehicle reidentification are evaluated in this paper 2 the code is released weaknesses 1 the novelty is limited in recent years many poolingbased methods have been proposed to explore the finegrained classification task such as adaptive bilinear pooling multidomain pooling and so on the ideas of arp are similar to these methods however the details are not discussed in this paper 2 in the experiments it seems unfair to compare the proposed method with the stateoftheart methods arp is a general pooling operation for finegrained representation learning it is necessary to verify whether arp helps improve these methods 3 many symbols in the paper are confusing although they are casesensitive or bold 4 many related references were not mentioned in this paper see above comments based on weaknesses and strengths i have concerns about the contributions and experimental evaluations docsepthis paper proposes an adaptive region pooling arp module that adaptively estimates the scale factors and crops the discriminative regions based on the estimated scale factors it aims to focus on the most discriminative region and simultaneously contain more finegrained information the authors apply the arp module to finegrained image recognition and vehicle reid tasks indeed the arp module is an attentional mechanism to adaptively locate discriminative regions to facilitate learning finegrained representation it first crops the feature from the most critical region and then downsampled to a consistent size through bilinear downsampling it is quite similar to current work 1 which proposes a contextaware attentional feature pooling cap module that learns to locate discriminative regions with different scales and map to fixedsize feature maps the advantage of the arp compared with cap should be presented and discussed besides this work achieves quite well performance eg 918 on cub the performance comparisons with this work should also be provided figure 3 presents the network architecture that shows how to integrate the arp module however why using such an architecture is not discussed besides are there other ways to integrate the module i think more discussion and experiments on how to integrate the apr module should be provided to compute the confidence map the authors utilize instance normalization to normalize the feature maps the motivation of this point is not well explained and its contribution is not verified 1 behera et al contextaware attentional pooling cap for finegrained visual classification in aaai 2021 i think the advantage and performance comparisons with current works are not well clarified the discussion and analysis about some crucial steps and components are not presented ### Summary:
four reviewers have reviewed this submission three of them recommended to reject the paper and one was marginally above the acceptance threshold the authors have not responded to the criticisms or questions of reviewers among many concerns were the issues with the use of clean and single target object images lack of discussions on related models such as adaptive bilinear pooling and multidomain pooling lack of evaluations on datasets such as largescale inaturalist given the above criticisms and the lack of authors response this submission falls below the acceptance bar
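(Editor's illustrative aside, not part of the original reviews: the reviews above describe ARP as cropping a feature map at an estimated scale and then bilinear-downsampling the crop to a consistent size. The sketch below is a minimal, hypothetical PyTorch rendering of that idea under stated assumptions; the function name, the center-crop heuristic, and the fixed output size are my own choices, not the authors' implementation.)

```python
# Hypothetical sketch of the crop-then-bilinear-downsample step the reviews attribute
# to adaptive region pooling (ARP). Names and the center-crop choice are assumptions.
import torch
import torch.nn.functional as F

def adaptive_region_pool(feat, scale, out_size=7):
    """feat: (B, C, H, W) feature map; scale: (B,) tensor of values in (0, 1]."""
    B, C, H, W = feat.shape
    pooled = []
    for b in range(B):
        s = float(scale[b].clamp(min=0.1, max=1.0))
        h, w = max(1, int(round(s * H))), max(1, int(round(s * W)))
        top, left = (H - h) // 2, (W - w) // 2          # center crop for simplicity
        crop = feat[b : b + 1, :, top : top + h, left : left + w]
        pooled.append(F.interpolate(crop, size=(out_size, out_size),
                                    mode="bilinear", align_corners=False))
    return torch.cat(pooled, dim=0)                      # (B, C, out_size, out_size)

# usage: pooled = adaptive_region_pool(backbone_features, predicted_scales)
```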
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 7152, 339, 431, 248, 2929, 310, 46710, 253, 9400, 273, 4030, 72, 11273, 6779, 275, 4499, 281, 10870, 3082, 549, 81, 2770, 327, 8790, 312, 6216, 3066, 1213, 1356, 4254, 281, 1957, 4030, 72, 11273, 1491, 50276, 5916, 310, 271, 27350, 5700, 281, 4908, 4619, 4811, 285, 310, 625, 44003, 2429, 342, 1554, 382, 486, 25282, 10336, 436, 958, 29499, 253, 6733, 2266, 2792, 273, 253, 19529, 50275, 4714, 3542, 2929, 534, 310, 3477, 281, 956, 1754, 327, 2590, 15895, 323, 4081, 1332, 50275, 783, 789, 3400, 247, 7000, 5740, 273, 253, 7092, 50275, 20881, 2792, 273, 253, 19529, 50275, 77, 10892, 9470, 897, 2219, 281, 7568, 326, 549, 81, 476, 17177, 247, 625, 20741, 800, 2919, 352, 651, 320, 4217, 281, 2085, 247, 24426, 326, 26662, 342, 643, 3082, 3340, 253, 1892, 285, 10058, 4619, 2234, 4811, 50274, 6050, 549, 81, 27593, 275, 4030, 72, 11273, 6779, 436, 2934, 22335, 310, 31998, 323, 643, 2460, 8892, 1677, 4706, 19, 285, 2429, 8245, 275, 4706, 21, 891, 717, 7514, 326, 627, 403, 2074, 7274, 4081, 275, 253, 1673, 273, 1327, 32829, 7098, 967, 8981, 604, 436, 2929, 310, 281, 320, 7607, 891, 2868, 352, 3198, 247, 11088, 6630, 670, 17825, 45900, 5700, 50276, 22309, 253, 1666, 25379, 275, 7180, 337, 285, 374, 513, 417, 2486, 7274, 275, 43425, 323, 1650, 337, 50275, 262, 310, 1175, 285, 3309, 326, 7277, 549, 81, 342, 247, 30333, 2074, 1332, 275, 4706, 7652, 331, 79, 310, 973, 4304, 533, 1711, 2556, 281, 253, 4477, 6630, 310, 627, 417, 247, 1805, 285, 8003, 1332, 275, 956, 484, 2987, 273, 331, 79, 50274, 5916, 3133, 281, 320, 247, 10358, 395, 1993, 6333, 476, 253, 3236, 1543, 320, 46002, 604, 4647, 352, 281, 5368, 35615, 436, 3368, 812, 1361, 5276, 253, 30437, 273, 549, 81, 50276, 18, 30287, 1362, 73, 256, 362, 247, 2266, 8245, 323, 4958, 294, 888, 1877, 550, 87, 1087, 43425, 36573, 3566, 17161, 5474, 33032, 16038, 253, 2929, 8219, 326, 50273, 66, 627, 403, 690, 1355, 533, 1774, 4811, 273, 271, 4722, 1789, 534, 403, 3798, 12841, 407, 253, 2045, 4030, 72, 11273, 3082, 50274, 67, 13622, 5141, 5871, 327, 2460, 6064, 28755, 253, 20741, 800, 3386, 50272, 9349, 2556, 281, 253, 1840, 7313, 253, 2929, 29328, 247, 45900, 5933, 1925, 17825, 2919, 45900, 549, 81, 436, 6333, 556, 767, 7259, 50274, 66, 3037, 281, 17177, 2460, 4811, 50274, 67, 1066, 48027, 2710, 1979, 4811, 970, 10370, 48971, 4254, 50274, 16217, 3825, 253, 4081, 1332, 310, 16058, 275, 767, 8892, 4030, 72, 11273, 9162, 285, 294, 888, 1877, 50276, 49363, 1543, 921, 326, 253, 11701, 403, 12532, 533, 891, 452, 1264, 15711, 50276, 66, 253, 4081, 1332, 310, 7194, 1754, 327, 253, 9376, 326, 627, 310, 760, 581, 2303, 1789, 275, 271, 2460, 533, 436, 310, 247, 12291, 273, 253, 2929, 310, 627, 247, 2442, 12291, 326, 253, 6333, 812, 417, 789, 973, 327, 253, 3888, 342, 2709, 7011, 5113, 390, 253, 549, 81, 310, 7996, 281, 253, 6046, 275, 3888, 50276, 67, 954, 273, 253, 15302, 908, 275, 253, 9380, 4679, 403, 1355, 4496, 12654, 253, 4081, 1332, 327, 247, 1236, 2510, 25912, 10895, 751, 275, 8646, 382, 908, 275, 12085, 3082, 824, 347, 246, 284, 79, 281, 12654, 327, 1236, 2510, 25912, 15302, 310, 671, 9371, 323, 2827, 2355, 253, 1840, 12291, 50276, 783, 26647, 273, 253, 4081, 1332, 310, 253, 5962, 4468, 1955, 281, 253, 958, 326, 3888, 908, 275, 253, 4679, 452, 247, 4076, 285, 2014, 2303, 1789, 352, 812, 320, 1805, 281, 897, 247, 1236, 2510, 25912, 10895, 281, 12654, 253, 26647, 3745, 273, 253, 4081, 1332, 5474, 33032, 2520, 2929, 10262, 271, 
17825, 2919, 45900, 549, 81, 1332, 323, 4030, 72, 11273, 6779, 4715, 549, 81, 19492, 253, 3386, 342, 253, 5998, 4311, 253, 3386, 403, 2007, 1066, 22163, 6216, 281, 247, 5185, 1979, 949, 10370, 48971, 1066, 48027, 4679, 5196, 327, 767, 8892, 17813, 253, 12510, 273, 253, 4081, 1332, 20544, 50276, 18, 1097, 8892, 273, 4030, 72, 11273, 2460, 9162, 285, 4958, 294, 888, 1877, 403, 6760, 275, 436, 2929, 374, 253, 2127, 310, 4439, 50276, 20881, 1255, 265, 337, 253, 38135, 310, 3710, 275, 3332, 1107, 1142, 45900, 3169, 3082, 452, 644, 4081, 281, 8338, 253, 4030, 72, 11273, 9162, 4836, 824, 347, 17825, 10370, 48971, 45900, 23964, 297, 404, 45900, 285, 594, 327, 253, 5697, 273, 549, 81, 403, 2074, 281, 841, 3082, 2299, 253, 4278, 403, 417, 5469, 275, 436, 2929, 50275, 19, 275, 253, 4679, 352, 3133, 16593, 281, 7277, 253, 4081, 1332, 342, 253, 1375, 23037, 14387, 3082, 549, 81, 310, 247, 2087, 45900, 4254, 323, 4030, 72, 11273, 6779, 4715, 352, 310, 3309, 281, 12654, 1880, 549, 81, 7729, 3157, 841, 3082, 50276, 20, 1142, 14217, 275, 253, 2929, 403, 21643, 3738, 597, 403, 2219, 18917, 390, 13433, 50276, 21, 1142, 2905, 10414, 497, 417, 5393, 275, 436, 2929, 50276, 2887, 1840, 5701, 50275, 3169, 327, 32213, 285, 20544, 891, 452, 7350, 670, 253, 9021, 285, 5661, 27163, 50276, 7152, 33032, 2520, 2929, 29328, 271, 17825, 2919, 45900, 549, 81, 6333, 326, 5223, 1242, 8197, 253, 4311, 2616, 285, 19492, 253, 20741, 800, 4811, 1754, 327, 253, 5998, 4311, 2616, 352, 13698, 281, 2770, 327, 253, 954, 20741, 800, 2919, 285, 10486, 3831, 625, 4030, 72, 11273, 1491, 253, 4477, 4647, 253, 549, 81, 6333, 281, 4030, 72, 11273, 2460, 8981, 285, 4958, 294, 301, 8892, 6296, 253, 549, 81, 6333, 310, 271, 4116, 267, 5122, 281, 5223, 1242, 19912, 20741, 800, 4811, 281, 12454, 4715, 4030, 72, 11273, 6779, 352, 806, 19492, 253, 4735, 432, 253, 954, 4619, 2919, 285, 840, 1066, 22163, 6216, 281, 247, 5185, 1979, 949, 10370, 48971, 1066, 48027, 352, 310, 3240, 2074, 281, 1655, 789, 337, 534, 29328, 247, 3634, 13823, 4116, 267, 4735, 45900, 1729, 6333, 326, 33772, 281, 19912, 20741, 800, 4811, 342, 1027, 11498, 285, 3711, 281, 4229, 3281, 4735, 8115, 253, 5750, 273, 253, 549, 81, 2429, 342, 1729, 943, 320, 3559, 285, 5469, 16280, 436, 789, 33526, 3240, 973, 3045, 24088, 898, 1093, 327, 12966, 253, 3045, 14023, 342, 436, 789, 943, 671, 320, 2530, 50276, 13206, 495, 10262, 253, 2990, 10336, 326, 2722, 849, 281, 19837, 253, 549, 81, 6333, 2299, 2139, 970, 824, 271, 10336, 310, 417, 5469, 16280, 403, 627, 643, 4088, 281, 19837, 253, 6333, 891, 1158, 625, 5955, 285, 4679, 327, 849, 281, 19837, 253, 1049, 83, 6333, 943, 320, 2530, 50276, 936, 11897, 253, 7162, 3711, 253, 4477, 16584, 4227, 21539, 281, 39142, 253, 4735, 8115, 253, 16038, 273, 436, 1127, 310, 417, 973, 5544, 285, 697, 7680, 310, 417, 16058, 50276, 18, 320, 248, 376, 1162, 355, 3634, 13823, 4116, 267, 45900, 1729, 323, 4030, 72, 11273, 5304, 9162, 275, 39951, 2284, 43425, 891, 1158, 253, 5750, 285, 3045, 14023, 342, 1655, 2987, 403, 417, 973, 31637, 253, 5955, 285, 1783, 670, 690, 9560, 5018, 285, 4295, 403, 417, 3559, 2490, 187, 4118, 18435, 27, 12496, 30628, 452, 9814, 436, 19529, 1264, 273, 731, 8521, 281, 12009, 253, 2929, 285, 581, 369, 42876, 1840, 253, 14924, 7887, 253, 4477, 452, 417, 10974, 281, 253, 43680, 390, 3533, 273, 30628, 2190, 1142, 7350, 497, 253, 3374, 342, 253, 897, 273, 9644, 285, 2014, 2303, 1789, 3888, 3480, 273, 11985, 327, 2905, 3210, 824, 347, 17825, 10370, 48971, 45900, 285, 23964, 297, 404, 45900, 3480, 273, 27163, 327, 15302, 824, 347, 50276, 
14915, 20039, 1079, 275, 8646, 382, 1677, 253, 1840, 43680, 285, 253, 3480, 273, 4477, 2380, 436, 19529, 11521, 2708, 253, 14924, 2534 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 7152, 339, 431, 248, 2929, 310, 46710, 253, 9400, 273, 4030, 72, 11273, 6779, 275, 4499, 281, 10870, 3082, 549, 81, 2770, 327, 8790, 312, 6216, 3066, 1213, 1356, 4254, 281, 1957, 4030, 72, 11273, 1491, 50276, 5916, 310, 271, 27350, 5700, 281, 4908, 4619, 4811, 285, 310, 625, 44003, 2429, 342, 1554, 382, 486, 25282, 10336, 436, 958, 29499, 253, 6733, 2266, 2792, 273, 253, 19529, 50275, 4714, 3542, 2929, 534, 310, 3477, 281, 956, 1754, 327, 2590, 15895, 323, 4081, 1332, 50275, 783, 789, 3400, 247, 7000, 5740, 273, 253, 7092, 50275, 20881, 2792, 273, 253, 19529, 50275, 77, 10892, 9470, 897, 2219, 281, 7568, 326, 549, 81, 476, 17177, 247, 625, 20741, 800, 2919, 352, 651, 320, 4217, 281, 2085, 247, 24426, 326, 26662, 342, 643, 3082, 3340, 253, 1892, 285, 10058, 4619, 2234, 4811, 50274, 6050, 549, 81, 27593, 275, 4030, 72, 11273, 6779, 436, 2934, 22335, 310, 31998, 323, 643, 2460, 8892, 1677, 4706, 19, 285, 2429, 8245, 275, 4706, 21, 891, 717, 7514, 326, 627, 403, 2074, 7274, 4081, 275, 253, 1673, 273, 1327, 32829, 7098, 967, 8981, 604, 436, 2929, 310, 281, 320, 7607, 891, 2868, 352, 3198, 247, 11088, 6630, 670, 17825, 45900, 5700, 50276, 22309, 253, 1666, 25379, 275, 7180, 337, 285, 374, 513, 417, 2486, 7274, 275, 43425, 323, 1650, 337, 50275, 262, 310, 1175, 285, 3309, 326, 7277, 549, 81, 342, 247, 30333, 2074, 1332, 275, 4706, 7652, 331, 79, 310, 973, 4304, 533, 1711, 2556, 281, 253, 4477, 6630, 310, 627, 417, 247, 1805, 285, 8003, 1332, 275, 956, 484, 2987, 273, 331, 79, 50274, 5916, 3133, 281, 320, 247, 10358, 395, 1993, 6333, 476, 253, 3236, 1543, 320, 46002, 604, 4647, 352, 281, 5368, 35615, 436, 3368, 812, 1361, 5276, 253, 30437, 273, 549, 81, 50276, 18, 30287, 1362, 73, 256, 362, 247, 2266, 8245, 323, 4958, 294, 888, 1877, 550, 87, 1087, 43425, 36573, 3566, 17161, 5474, 33032, 16038, 253, 2929, 8219, 326, 50273, 66, 627, 403, 690, 1355, 533, 1774, 4811, 273, 271, 4722, 1789, 534, 403, 3798, 12841, 407, 253, 2045, 4030, 72, 11273, 3082, 50274, 67, 13622, 5141, 5871, 327, 2460, 6064, 28755, 253, 20741, 800, 3386, 50272, 9349, 2556, 281, 253, 1840, 7313, 253, 2929, 29328, 247, 45900, 5933, 1925, 17825, 2919, 45900, 549, 81, 436, 6333, 556, 767, 7259, 50274, 66, 3037, 281, 17177, 2460, 4811, 50274, 67, 1066, 48027, 2710, 1979, 4811, 970, 10370, 48971, 4254, 50274, 16217, 3825, 253, 4081, 1332, 310, 16058, 275, 767, 8892, 4030, 72, 11273, 9162, 285, 294, 888, 1877, 50276, 49363, 1543, 921, 326, 253, 11701, 403, 12532, 533, 891, 452, 1264, 15711, 50276, 66, 253, 4081, 1332, 310, 7194, 1754, 327, 253, 9376, 326, 627, 310, 760, 581, 2303, 1789, 275, 271, 2460, 533, 436, 310, 247, 12291, 273, 253, 2929, 310, 627, 247, 2442, 12291, 326, 253, 6333, 812, 417, 789, 973, 327, 253, 3888, 342, 2709, 7011, 5113, 390, 253, 549, 81, 310, 7996, 281, 253, 6046, 275, 3888, 50276, 67, 954, 273, 253, 15302, 908, 275, 253, 9380, 4679, 403, 1355, 4496, 12654, 253, 4081, 1332, 327, 247, 1236, 2510, 25912, 10895, 751, 275, 8646, 382, 908, 275, 12085, 3082, 824, 347, 246, 284, 79, 281, 12654, 327, 1236, 2510, 25912, 15302, 310, 671, 9371, 323, 2827, 2355, 253, 1840, 12291, 50276, 783, 26647, 273, 253, 4081, 1332, 310, 253, 5962, 4468, 1955, 281, 253, 958, 326, 3888, 908, 275, 253, 4679, 452, 247, 4076, 285, 2014, 2303, 1789, 352, 812, 320, 1805, 281, 897, 247, 1236, 2510, 25912, 10895, 281, 12654, 253, 26647, 3745, 273, 253, 4081, 1332, 5474, 33032, 2520, 2929, 10262, 271, 
17825, 2919, 45900, 549, 81, 1332, 323, 4030, 72, 11273, 6779, 4715, 549, 81, 19492, 253, 3386, 342, 253, 5998, 4311, 253, 3386, 403, 2007, 1066, 22163, 6216, 281, 247, 5185, 1979, 949, 10370, 48971, 1066, 48027, 4679, 5196, 327, 767, 8892, 17813, 253, 12510, 273, 253, 4081, 1332, 20544, 50276, 18, 1097, 8892, 273, 4030, 72, 11273, 2460, 9162, 285, 4958, 294, 888, 1877, 403, 6760, 275, 436, 2929, 374, 253, 2127, 310, 4439, 50276, 20881, 1255, 265, 337, 253, 38135, 310, 3710, 275, 3332, 1107, 1142, 45900, 3169, 3082, 452, 644, 4081, 281, 8338, 253, 4030, 72, 11273, 9162, 4836, 824, 347, 17825, 10370, 48971, 45900, 23964, 297, 404, 45900, 285, 594, 327, 253, 5697, 273, 549, 81, 403, 2074, 281, 841, 3082, 2299, 253, 4278, 403, 417, 5469, 275, 436, 2929, 50275, 19, 275, 253, 4679, 352, 3133, 16593, 281, 7277, 253, 4081, 1332, 342, 253, 1375, 23037, 14387, 3082, 549, 81, 310, 247, 2087, 45900, 4254, 323, 4030, 72, 11273, 6779, 4715, 352, 310, 3309, 281, 12654, 1880, 549, 81, 7729, 3157, 841, 3082, 50276, 20, 1142, 14217, 275, 253, 2929, 403, 21643, 3738, 597, 403, 2219, 18917, 390, 13433, 50276, 21, 1142, 2905, 10414, 497, 417, 5393, 275, 436, 2929, 50276, 2887, 1840, 5701, 50275, 3169, 327, 32213, 285, 20544, 891, 452, 7350, 670, 253, 9021, 285, 5661, 27163, 50276, 7152, 33032, 2520, 2929, 29328, 271, 17825, 2919, 45900, 549, 81, 6333, 326, 5223, 1242, 8197, 253, 4311, 2616, 285, 19492, 253, 20741, 800, 4811, 1754, 327, 253, 5998, 4311, 2616, 352, 13698, 281, 2770, 327, 253, 954, 20741, 800, 2919, 285, 10486, 3831, 625, 4030, 72, 11273, 1491, 253, 4477, 4647, 253, 549, 81, 6333, 281, 4030, 72, 11273, 2460, 8981, 285, 4958, 294, 301, 8892, 6296, 253, 549, 81, 6333, 310, 271, 4116, 267, 5122, 281, 5223, 1242, 19912, 20741, 800, 4811, 281, 12454, 4715, 4030, 72, 11273, 6779, 352, 806, 19492, 253, 4735, 432, 253, 954, 4619, 2919, 285, 840, 1066, 22163, 6216, 281, 247, 5185, 1979, 949, 10370, 48971, 1066, 48027, 352, 310, 3240, 2074, 281, 1655, 789, 337, 534, 29328, 247, 3634, 13823, 4116, 267, 4735, 45900, 1729, 6333, 326, 33772, 281, 19912, 20741, 800, 4811, 342, 1027, 11498, 285, 3711, 281, 4229, 3281, 4735, 8115, 253, 5750, 273, 253, 549, 81, 2429, 342, 1729, 943, 320, 3559, 285, 5469, 16280, 436, 789, 33526, 3240, 973, 3045, 24088, 898, 1093, 327, 12966, 253, 3045, 14023, 342, 436, 789, 943, 671, 320, 2530, 50276, 13206, 495, 10262, 253, 2990, 10336, 326, 2722, 849, 281, 19837, 253, 549, 81, 6333, 2299, 2139, 970, 824, 271, 10336, 310, 417, 5469, 16280, 403, 627, 643, 4088, 281, 19837, 253, 6333, 891, 1158, 625, 5955, 285, 4679, 327, 849, 281, 19837, 253, 1049, 83, 6333, 943, 320, 2530, 50276, 936, 11897, 253, 7162, 3711, 253, 4477, 16584, 4227, 21539, 281, 39142, 253, 4735, 8115, 253, 16038, 273, 436, 1127, 310, 417, 973, 5544, 285, 697, 7680, 310, 417, 16058, 50276, 18, 320, 248, 376, 1162, 355, 3634, 13823, 4116, 267, 45900, 1729, 323, 4030, 72, 11273, 5304, 9162, 275, 39951, 2284, 43425, 891, 1158, 253, 5750, 285, 3045, 14023, 342, 1655, 2987, 403, 417, 973, 31637, 253, 5955, 285, 1783, 670, 690, 9560, 5018, 285, 4295, 403, 417, 3559, 2490, 187, 4118, 18435, 27, 12496, 30628, 452, 9814, 436, 19529, 1264, 273, 731, 8521, 281, 12009, 253, 2929, 285, 581, 369, 42876, 1840, 253, 14924, 7887, 253, 4477, 452, 417, 10974, 281, 253, 43680, 390, 3533, 273, 30628, 2190, 1142, 7350, 497, 253, 3374, 342, 253, 897, 273, 9644, 285, 2014, 2303, 1789, 3888, 3480, 273, 11985, 327, 2905, 3210, 824, 347, 17825, 10370, 48971, 45900, 285, 23964, 297, 404, 45900, 3480, 273, 27163, 327, 15302, 824, 347, 50276, 
14915, 20039, 1079, 275, 8646, 382, 1677, 253, 1840, 43680, 285, 253, 3480, 273, 4477, 2380, 436, 19529, 11521, 2708, 253, 14924, 2534 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: useful regret bounds are provided especially for dpts it is shown that the regret is optimal till an olog logt factor it is identified that using the same reward observation for multiple estimations will require more noise for the same degree of privacy in theorem 2 and 4 equivalence between differential privacy guarantees for computing empirical mean in 1 round and differential privacy of overall the algorithm is used a composition overall differentially private empirical mean calculation needs to be shown to state the algorithm as differentially private the privacy factor also changes in this case a more technical analysis needs to be shown to demonstrate that using same reward observation multiple times will require more noise theorems should be numbered starting from 1 not 2 composition over all empirical mean estimation needs to be shown to state dp t s as differentially private since the same reward observations would be used repeatedly also the complete algorithm will have a privacy factor exceeding because of composition docsepthe work proposes the first tstype algorithm for differentially private stochastic bandits due to the better practical performance of ts such an attempt is worthwhile the regret upper bound of the proposed dplazyts algorithm matches the problem lower bound some experiments are conducted to show the advantage of tsbased algorithms over ucb and eliminationtype ones the main weakness is the technical contribution given the previous analysis of anytimelazyucb hu et al 2021 the main difficulty is to analyze the regret caused by inaccurate samples of the optimal arm 1 this paper deal with this difficulty by simply reshaping the posterior distribution optimistically and then using the standard nonprivate ts analysis on this part based on the above consideration the technical contribution is a bit weak q1 the paper studies the differentially private stochastic bandit problem and proposes tsbased algorithms to solve the problem the main idea to obtain the regret analysis while guaranteeing the dp is to reshape the posterior distribution in an optimistic way such that the mean of the posterior can be better than that in the nonprivate setting they propose both dpts and lazydpts the latter matches the regret lower bound some experiments are conducted to verify the efficiency of the tsbased algorithm compared with ucb and eliminationbased ones q5 the current technique to deal with ts is reshaping the posterior distribution and thus previous analysis for nonprivate ts can be used i would like to see more discussions in the paper on the difficulty of ts itself for the dp problem some minor issues the regret with expectation over actions and rewards should be called expected regret in experiments how the reward is generated and the number of runs are not reported docsep1 the differential privacy setting is an important area and they propose the first order optimal thompson sampling algorithm for this setting 2 the exposition to their algorithm is clear and they also theoretically analyze their algorithm the lazydpts is also order optimal and reaches the lower bound proposed in shariff and sheffet 2018 3 they empirically analyze their algorithm on toy datasets 1 it seems to be that the regret analysis mostly follows the same track as agrawal and goyal 2017 with some minor changes what is the key technical improvementnovelty over their proof 2 similarly theorem 2 
and theorem 4 showing dpts and lazydpts are differentially private seems to be the extension of the proofs from dwork et al 2014 what is the key technical improvement or novelty over their proof 3 writing can be improved you need to discuss the implication of definition 1 especially why there is e^epsilon and not just epsilon it is also not clear to me why you choose laplacian distribution to set the value of cj the cumulative differential privacy reward tracker can you explain it refer to 1 2 3 in q4 4 the role of vepsilon ojt1 t is not clear to me is it like an optimistic estimation of the mean that shifts the posterior to the right 5 then again it seems that you clip the optimistic estimation of the mean bar mu_j i dont think that agrawal and goyal 2017 use any clipped mean estimation how do you handle this in your proof sketch is that the key improvement over the proof technique of agrawal and goyal 2017 6 the toy experiments really dont bring out much it would be nice to have some experiments on a real dataset like movielens or yahoo which are standard datasets used in the mab setting ### Summary:
meta review this paper proposes nearly optimal thompson sampling based algorithms with differential privacy guarantees the authors have provided a very detailed and helpful response to the reviewer and meta reviewers questions there is unanimous support to accept this paper thus i recommend acceptance
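(Editor's illustrative aside, not from the paper under review: the reviews discuss epsilon-differentially-private estimation of an arm's empirical mean via Laplacian noise, and note that reusing the same rewards for several estimates forces more noise through composition. The sketch below shows the standard Laplace mechanism for a bounded-reward mean under stated assumptions; it is not the paper's DP-TS or Lazy-DP-TS algorithm.)

```python
# Minimal sketch of an epsilon-DP empirical mean for rewards bounded in [0, 1].
# Changing one reward moves the mean of n rewards by at most 1/n (the sensitivity),
# so Laplace noise with scale 1/(n * epsilon) gives epsilon-DP for a single release;
# releasing estimates based on the same data k times costs roughly k * epsilon by
# basic composition, which is the reviewers' point about reusing observations.
import numpy as np

def private_mean(rewards, epsilon, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    rewards = np.clip(np.asarray(rewards, dtype=float), 0.0, 1.0)
    sensitivity = 1.0 / len(rewards)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return rewards.mean() + noise

# usage: mu_hat = private_mean(observed_rewards_for_arm_j, epsilon=1.0)
```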
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 4217, 14938, 14493, 403, 2530, 3340, 323, 277, 45276, 352, 310, 2011, 326, 253, 14938, 310, 8654, 7357, 271, 258, 2808, 2412, 85, 2803, 50276, 262, 310, 3636, 326, 970, 253, 1072, 10921, 8310, 323, 2709, 3311, 569, 588, 2430, 625, 6046, 323, 253, 1072, 4248, 273, 11068, 275, 10012, 374, 285, 577, 19945, 875, 8967, 11068, 23632, 323, 12672, 16774, 1599, 275, 337, 3790, 285, 8967, 11068, 273, 4583, 253, 5933, 310, 908, 247, 5889, 4583, 21673, 3055, 16774, 1599, 10272, 3198, 281, 320, 2011, 281, 1375, 253, 5933, 347, 21673, 3055, 253, 11068, 2803, 671, 2544, 275, 436, 1083, 50276, 66, 625, 7681, 1783, 3198, 281, 320, 2011, 281, 7568, 326, 970, 1072, 10921, 8310, 2709, 2069, 588, 2430, 625, 6046, 39383, 943, 320, 31050, 4983, 432, 337, 417, 374, 50276, 42190, 689, 512, 16774, 1599, 13418, 3198, 281, 320, 2011, 281, 1375, 33234, 246, 256, 347, 21673, 3055, 1580, 253, 1072, 10921, 7313, 651, 320, 908, 12889, 671, 253, 3426, 5933, 588, 452, 247, 11068, 2803, 27433, 50276, 12157, 273, 5889, 5474, 339, 431, 248, 789, 29328, 253, 806, 246, 296, 1692, 5933, 323, 21673, 3055, 19191, 3961, 953, 50276, 21848, 281, 253, 1805, 8542, 3045, 273, 28669, 824, 271, 3177, 310, 32811, 50276, 783, 14938, 5170, 3033, 273, 253, 4081, 277, 446, 26537, 1641, 5933, 10129, 253, 1895, 2406, 3033, 50276, 8826, 4679, 403, 5196, 281, 921, 253, 5750, 273, 28669, 3169, 11333, 689, 44274, 67, 285, 20408, 881, 4394, 50276, 783, 2022, 14855, 310, 253, 7681, 7680, 1677, 253, 2045, 1783, 273, 667, 12292, 293, 26537, 1028, 67, 30287, 1162, 355, 43425, 253, 2022, 10183, 310, 281, 12106, 253, 14938, 4269, 407, 31215, 3530, 273, 253, 8654, 4430, 337, 436, 2929, 2968, 342, 436, 10183, 407, 3365, 40206, 15609, 253, 12637, 3268, 5556, 18260, 285, 840, 970, 253, 2629, 1327, 9486, 28669, 1783, 327, 436, 629, 1754, 327, 253, 1840, 8180, 253, 7681, 7680, 310, 247, 2372, 5075, 2805, 18, 50276, 783, 2929, 2175, 253, 21673, 3055, 19191, 3961, 262, 1895, 285, 29328, 28669, 3169, 11333, 281, 8415, 253, 1895, 253, 2022, 2934, 281, 4044, 253, 14938, 1783, 1223, 12215, 272, 253, 33234, 310, 281, 40206, 2259, 253, 12637, 3268, 275, 271, 28684, 1039, 824, 326, 253, 1599, 273, 253, 12637, 476, 320, 1805, 685, 326, 275, 253, 1327, 9486, 4758, 597, 12661, 1097, 277, 45276, 285, 22658, 69, 45276, 253, 6158, 10129, 253, 14938, 2406, 3033, 690, 4679, 403, 5196, 281, 12654, 253, 6733, 273, 253, 28669, 3169, 5933, 2429, 342, 44274, 67, 285, 20408, 3169, 4394, 50276, 82, 22, 50276, 783, 1655, 5853, 281, 2968, 342, 28669, 310, 40206, 15609, 253, 12637, 3268, 285, 3021, 2045, 1783, 323, 1327, 9486, 28669, 476, 320, 908, 891, 651, 751, 281, 923, 625, 11985, 275, 253, 2929, 327, 253, 10183, 273, 28669, 3139, 323, 253, 33234, 1895, 50275, 8826, 5884, 3374, 50276, 783, 14938, 342, 15355, 689, 5231, 285, 23267, 943, 320, 1925, 3264, 14938, 50276, 249, 4679, 849, 253, 10921, 310, 4561, 285, 253, 1180, 273, 6613, 403, 417, 2361, 50275, 7152, 33032, 18, 253, 8967, 11068, 4758, 310, 271, 1774, 2170, 285, 597, 12661, 253, 806, 1340, 8654, 289, 297, 10836, 10491, 5933, 323, 436, 4758, 50275, 19, 253, 47284, 281, 616, 5933, 310, 2590, 285, 597, 671, 28055, 12106, 616, 5933, 253, 22658, 69, 45276, 310, 671, 1340, 8654, 285, 14190, 253, 2406, 3033, 4081, 275, 17614, 1648, 285, 703, 567, 292, 4765, 50276, 20, 597, 45190, 12106, 616, 5933, 327, 20953, 15302, 337, 352, 3133, 281, 320, 326, 253, 14938, 1783, 6571, 3637, 253, 1072, 3540, 
347, 639, 2040, 267, 285, 564, 90, 267, 4240, 342, 690, 5884, 2544, 752, 310, 253, 2234, 7681, 7756, 2369, 652, 555, 689, 616, 4737, 50276, 19, 12014, 10012, 374, 285, 10012, 577, 4645, 277, 45276, 285, 22658, 69, 45276, 403, 21673, 3055, 3133, 281, 320, 253, 6880, 273, 253, 27947, 432, 277, 1601, 1162, 355, 4059, 752, 310, 253, 2234, 7681, 7756, 2369, 652, 555, 689, 616, 4737, 50276, 20, 4028, 476, 320, 5520, 368, 878, 281, 2319, 253, 27570, 273, 5426, 337, 3340, 2139, 627, 310, 299, 4259, 285, 417, 816, 299, 4277, 352, 310, 671, 417, 2590, 281, 479, 2139, 368, 5206, 826, 43917, 3268, 281, 873, 253, 1318, 273, 260, 75, 253, 18849, 8967, 11068, 10921, 40143, 476, 368, 5513, 352, 3730, 281, 337, 374, 495, 275, 2805, 21, 50275, 21, 253, 2554, 273, 1670, 4277, 258, 42565, 18, 246, 310, 417, 2590, 281, 479, 310, 352, 751, 271, 28684, 13418, 273, 253, 1599, 326, 15036, 253, 12637, 281, 253, 987, 50276, 22, 840, 969, 352, 3133, 326, 368, 17230, 253, 28684, 13418, 273, 253, 1599, 270, 1513, 10441, 891, 13414, 1158, 326, 639, 2040, 267, 285, 564, 90, 267, 4240, 897, 667, 502, 6390, 1599, 13418, 849, 513, 368, 6016, 436, 275, 634, 4737, 23211, 310, 326, 253, 2234, 7756, 689, 253, 4737, 5853, 273, 639, 2040, 267, 285, 564, 90, 267, 4240, 50276, 23, 253, 20953, 4679, 1663, 13414, 3324, 562, 1199, 352, 651, 320, 5322, 281, 452, 690, 4679, 327, 247, 1524, 10895, 751, 1855, 928, 561, 49269, 534, 403, 2629, 15302, 908, 275, 253, 278, 357, 4758, 2490, 187, 4118, 18435, 27, 13518, 2278, 436, 2929, 29328, 4829, 8654, 289, 297, 10836, 10491, 3169, 11333, 342, 21673, 11068, 23632, 253, 4477, 452, 2530, 247, 1077, 7000, 285, 9371, 2380, 281, 253, 37317, 285, 11419, 30628, 3533, 627, 310, 42293, 1329, 281, 2997, 436, 2929, 3021, 891, 5583, 14924, 50276 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 4217, 14938, 14493, 403, 2530, 3340, 323, 277, 45276, 352, 310, 2011, 326, 253, 14938, 310, 8654, 7357, 271, 258, 2808, 2412, 85, 2803, 50276, 262, 310, 3636, 326, 970, 253, 1072, 10921, 8310, 323, 2709, 3311, 569, 588, 2430, 625, 6046, 323, 253, 1072, 4248, 273, 11068, 275, 10012, 374, 285, 577, 19945, 875, 8967, 11068, 23632, 323, 12672, 16774, 1599, 275, 337, 3790, 285, 8967, 11068, 273, 4583, 253, 5933, 310, 908, 247, 5889, 4583, 21673, 3055, 16774, 1599, 10272, 3198, 281, 320, 2011, 281, 1375, 253, 5933, 347, 21673, 3055, 253, 11068, 2803, 671, 2544, 275, 436, 1083, 50276, 66, 625, 7681, 1783, 3198, 281, 320, 2011, 281, 7568, 326, 970, 1072, 10921, 8310, 2709, 2069, 588, 2430, 625, 6046, 39383, 943, 320, 31050, 4983, 432, 337, 417, 374, 50276, 42190, 689, 512, 16774, 1599, 13418, 3198, 281, 320, 2011, 281, 1375, 33234, 246, 256, 347, 21673, 3055, 1580, 253, 1072, 10921, 7313, 651, 320, 908, 12889, 671, 253, 3426, 5933, 588, 452, 247, 11068, 2803, 27433, 50276, 12157, 273, 5889, 5474, 339, 431, 248, 789, 29328, 253, 806, 246, 296, 1692, 5933, 323, 21673, 3055, 19191, 3961, 953, 50276, 21848, 281, 253, 1805, 8542, 3045, 273, 28669, 824, 271, 3177, 310, 32811, 50276, 783, 14938, 5170, 3033, 273, 253, 4081, 277, 446, 26537, 1641, 5933, 10129, 253, 1895, 2406, 3033, 50276, 8826, 4679, 403, 5196, 281, 921, 253, 5750, 273, 28669, 3169, 11333, 689, 44274, 67, 285, 20408, 881, 4394, 50276, 783, 2022, 14855, 310, 253, 7681, 7680, 1677, 253, 2045, 1783, 273, 667, 12292, 293, 26537, 1028, 67, 30287, 1162, 355, 43425, 253, 2022, 10183, 310, 281, 12106, 253, 14938, 4269, 407, 31215, 3530, 273, 253, 8654, 4430, 337, 436, 2929, 2968, 342, 436, 10183, 407, 3365, 40206, 15609, 253, 12637, 3268, 5556, 18260, 285, 840, 970, 253, 2629, 1327, 9486, 28669, 1783, 327, 436, 629, 1754, 327, 253, 1840, 8180, 253, 7681, 7680, 310, 247, 2372, 5075, 2805, 18, 50276, 783, 2929, 2175, 253, 21673, 3055, 19191, 3961, 262, 1895, 285, 29328, 28669, 3169, 11333, 281, 8415, 253, 1895, 253, 2022, 2934, 281, 4044, 253, 14938, 1783, 1223, 12215, 272, 253, 33234, 310, 281, 40206, 2259, 253, 12637, 3268, 275, 271, 28684, 1039, 824, 326, 253, 1599, 273, 253, 12637, 476, 320, 1805, 685, 326, 275, 253, 1327, 9486, 4758, 597, 12661, 1097, 277, 45276, 285, 22658, 69, 45276, 253, 6158, 10129, 253, 14938, 2406, 3033, 690, 4679, 403, 5196, 281, 12654, 253, 6733, 273, 253, 28669, 3169, 5933, 2429, 342, 44274, 67, 285, 20408, 3169, 4394, 50276, 82, 22, 50276, 783, 1655, 5853, 281, 2968, 342, 28669, 310, 40206, 15609, 253, 12637, 3268, 285, 3021, 2045, 1783, 323, 1327, 9486, 28669, 476, 320, 908, 891, 651, 751, 281, 923, 625, 11985, 275, 253, 2929, 327, 253, 10183, 273, 28669, 3139, 323, 253, 33234, 1895, 50275, 8826, 5884, 3374, 50276, 783, 14938, 342, 15355, 689, 5231, 285, 23267, 943, 320, 1925, 3264, 14938, 50276, 249, 4679, 849, 253, 10921, 310, 4561, 285, 253, 1180, 273, 6613, 403, 417, 2361, 50275, 7152, 33032, 18, 253, 8967, 11068, 4758, 310, 271, 1774, 2170, 285, 597, 12661, 253, 806, 1340, 8654, 289, 297, 10836, 10491, 5933, 323, 436, 4758, 50275, 19, 253, 47284, 281, 616, 5933, 310, 2590, 285, 597, 671, 28055, 12106, 616, 5933, 253, 22658, 69, 45276, 310, 671, 1340, 8654, 285, 14190, 253, 2406, 3033, 4081, 275, 17614, 1648, 285, 703, 567, 292, 4765, 50276, 20, 597, 45190, 12106, 616, 5933, 327, 20953, 15302, 337, 352, 3133, 281, 320, 326, 253, 14938, 1783, 6571, 3637, 253, 1072, 3540, 
347, 639, 2040, 267, 285, 564, 90, 267, 4240, 342, 690, 5884, 2544, 752, 310, 253, 2234, 7681, 7756, 2369, 652, 555, 689, 616, 4737, 50276, 19, 12014, 10012, 374, 285, 10012, 577, 4645, 277, 45276, 285, 22658, 69, 45276, 403, 21673, 3055, 3133, 281, 320, 253, 6880, 273, 253, 27947, 432, 277, 1601, 1162, 355, 4059, 752, 310, 253, 2234, 7681, 7756, 2369, 652, 555, 689, 616, 4737, 50276, 20, 4028, 476, 320, 5520, 368, 878, 281, 2319, 253, 27570, 273, 5426, 337, 3340, 2139, 627, 310, 299, 4259, 285, 417, 816, 299, 4277, 352, 310, 671, 417, 2590, 281, 479, 2139, 368, 5206, 826, 43917, 3268, 281, 873, 253, 1318, 273, 260, 75, 253, 18849, 8967, 11068, 10921, 40143, 476, 368, 5513, 352, 3730, 281, 337, 374, 495, 275, 2805, 21, 50275, 21, 253, 2554, 273, 1670, 4277, 258, 42565, 18, 246, 310, 417, 2590, 281, 479, 310, 352, 751, 271, 28684, 13418, 273, 253, 1599, 326, 15036, 253, 12637, 281, 253, 987, 50276, 22, 840, 969, 352, 3133, 326, 368, 17230, 253, 28684, 13418, 273, 253, 1599, 270, 1513, 10441, 891, 13414, 1158, 326, 639, 2040, 267, 285, 564, 90, 267, 4240, 897, 667, 502, 6390, 1599, 13418, 849, 513, 368, 6016, 436, 275, 634, 4737, 23211, 310, 326, 253, 2234, 7756, 689, 253, 4737, 5853, 273, 639, 2040, 267, 285, 564, 90, 267, 4240, 50276, 23, 253, 20953, 4679, 1663, 13414, 3324, 562, 1199, 352, 651, 320, 5322, 281, 452, 690, 4679, 327, 247, 1524, 10895, 751, 1855, 928, 561, 49269, 534, 403, 2629, 15302, 908, 275, 253, 278, 357, 4758, 2490, 187, 4118, 18435, 27, 13518, 2278, 436, 2929, 29328, 4829, 8654, 289, 297, 10836, 10491, 3169, 11333, 342, 21673, 11068, 23632, 253, 4477, 452, 2530, 247, 1077, 7000, 285, 9371, 2380, 281, 253, 37317, 285, 11419, 30628, 3533, 627, 310, 42293, 1329, 281, 2997, 436, 2929, 3021, 891, 5583, 14924, 50276 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors present two online biologically plausible algorithms for the irreducible representation learning method proposed by cohen and welling 2014 the first algorithm uses svd to extract the subspace of maximal average rotation as part of this algorithm the authors present a novel algorithm for online svd crosspower iteration the second is based on pca of time difference vectors and uses local learning rules to preserve biological plausibility the authors propose that these models may serve as hypotheses to guide future connectomics and neurophysiology research results the authors find that both the svd and pca methods are able to recover the irreducible representations of so2 and s1 on synthetic datasets these synthetic datasets are sets of random images or vectors sampled from the standard normal distribution followed by the application of the transformations of interest translation or rotation strengths the paper contributes two novel and interesting methods for learning the irreducible representations of toroidal groups with biologically plausible mechanisms this is a valuable contribution for both computational neuroscience as a hypothesis for neural circuitry and machine learning as a novel algorithm for learning and modeling transformation structure from data weaknesses it would be nice to see a quantitative analysis on the learned filters explicitly comparing to the standard translational and rotational fourier bases and reporting how well the learned weights approximate these bases it would be useful to see how well the model trains on natural datasets general comments the paper is framed as a neurobiological model but it really seems more neuromorphic as the emphasis is on how to learn all these things with local learning rules in a neural circuit as opposed to specific neurobiological substrates in the visual system there are issues such as how to compute arctan and many others that would need to be discussed or elaborated more to think of this in neurobiological terms it will be interesting to see how these results could be extended to realworld applications such as estimating optical flow from natural videos see above docsepthis manuscript proposes biologically plausible algorithms for learning group transformations from sequences of observations two algorithms are proposed one based on svd and the other on pca the performance of these algorithms is evaluated in synthetic experiments strengths the authors address a very interesting and challenging question biological plausibility of the learning of transformation representations weaknesses the setting is quite restrictive reducing the problem to previously proposed biologically plausible algorithms for pca and svd experiments are very toylike with very simplified assumptions whose influence is not tested relation to previous work on biologically inspired pca and other such algorithms is elusive it is difficult to figure out what is the originality of the proposed approaches and why previous approaches could not directly be applied limitations are not discussed docsepthe present paper studied the neural implementation of learning irreducible representation of commuting group transformations specifically using the 2d rotation group as an example the paper studied how the neural network learns the rotation groups by implementing different algorithms including singular value decomposition and principal component 
analysis and different algorithms eventually lead to different network architecture it is a fundamental question that how neural circuits learn the commutative group transformations this study provides two novel solutions which rely on single neuron mechanism implementing svd and network mechanism implementing pca respectively to learn the group transformations the network model implementing pca similarity matching specifically to learn transformations is based on a recent study ref 1 so it is not clear how much difference of this part compared with the mechanism presented in ref 1 the whole paper is structurewise but some presentation is not clear and the comparison with earlier studies was not done systematically see details below writing eq 3 the index t should be t1 if i understood correctly line 141 should the sigma be the g in lemma 1 if so please use consistent notations across the manuscript it is not clear the derivation in eq 18 based on eq 16 and some motivations are missing also it is not clear the meaning of matrices w and m in eq 18 and i need to guess whether they correspond to feedforward and recurrent weight matrices respectively fig 5a not clear which algorithm leads to the presented filters that is are those filters learnt from the svd representing the u and v or from the network implementation of pca w and m both models considered in this paper only passively estimate the rotation angle it would be more interesting to synthesize the transformed images as in an autoencoder framework docsepthe authors depict two novel biologically plausible algorithms for online estimation of transformations using irreducible representations by building on prior work existing algorithms making use of commutative lie groups for decomposing transformations are altered for biological plausibility possible neural implementations are depicted which can be searched for in connectomics the results are evaluated experimentally on two small examples strengths originality the general idea of representing rotations by toroidal groups is not new but nonetheless interesting and coming up with actual biologically plausible implementations is invaluable the text is accessible starting with a general introduction and good explanation of the problem at hand all proofs including proof of convergence for crosspower iteration are readily available in the appendix weaknesses not clear which parts are contributions ie labeled as theorem sadly no code provided quantitative comparison of the experimental results with biologically implausible counterparts ie cohen and welling wouldve been interesting suspected errors line 73 canonical ly line 227 canter limitations such as biological implausibility of the original svd based algorithm are stated clearly ### Summary:
this manuscript presents novel biologically plausible algorithms for learning representations for lie groups the derivation of the algorithms and the networks are based on previously studied biologically plausible networks although there are some limitations the reviewers agree that this work is sound clearly presented and represents a valuable contribution to both computational and theoretical neuroscience and machine learning
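(Editor's illustrative aside, not part of the reviewed manuscript: the reviews describe a second algorithm that applies PCA with local, Hebbian-style learning rules to time-difference vectors x_t - x_{t-1} to recover the transformation subspace. The sketch below is a generic online Oja-style subspace update on difference vectors, given only to make the idea concrete; it is not the authors' similarity-matching network, and the learning rate and initialization are arbitrary assumptions.)

```python
# Generic online Oja subspace rule applied to time-difference vectors.
# W has shape (k, d): k output units, d input dimensions. The update uses only
# quantities locally available to each unit (its input, its output, and feedback
# through the current weights), which is the sense of "biologically plausible"
# discussed in the reviews.
import numpy as np

def oja_on_differences(X, k=2, lr=1e-3, rng=None):
    """X: (T, d) sequence of observations; returns W whose rows approximately
    span the top-k principal subspace of the time differences x_t - x_{t-1}."""
    rng = np.random.default_rng() if rng is None else rng
    T, d = X.shape
    W = 0.1 * rng.standard_normal((k, d))
    for t in range(1, T):
        dx = X[t] - X[t - 1]                                # time-difference input
        y = W @ dx                                          # (k,) unit outputs
        W += lr * (np.outer(y, dx) - np.outer(y, y) @ W)    # Oja subspace rule
    return W
```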
[ input_ids / attention_mask / labels for the preceding row: the prompt, reviews, and summary re-encoded as a long list of token IDs, an attention_mask consisting entirely of 1s, and a labels list identical to the input_ids; the full integer lists are omitted here for readability ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary this paper proposes a novel contrastive learning method of sentence embeddings using unpaired examples from another modality experiments on semantic textual similarity benchmarks demonstrate that this method is able to produce sentence embeddings higher quality strength this paper presents a novel method to conduct multimodal contrastive learning for sentence encoders this method is simple and generalizable ie not only applied to vision but also to other modalities such as audio weakness im not confident that the improvement is significant on sts tasks on average it improves 1 accuracy in comparison the original simcse paper improves more than 4 accuracy and is able to push the stateoftheart in a supervised setting i wonder how much variance these datasets have and how much hyperparameters sweeps is conducted for both methods the model setup is a bit unintuitive in order to conduct multimodal contrastive learning it assumes that we can replace the embedding layer of a sentence encoder with a visual embedding layer to get an image encoder na docsepthis work drives a plausible approach for enhancing unsupervised sentence embedding learning by leveraging unpaired examples from another modality motivated by the previous success of lu et al 33 transformer models can transfer knowledge by learning from another modality eg vision or audio the text part uses the same training paradigm as simcse while an additional image contrastive loss is incorporated by using the same language transformer for image encoding results show that the proposed method is languageagnostic and outperforms vanilla simcse across various tasks analysis versus supervised simcse also illustrated that the proposed approach performs better than simcse learned with a noisy and smaller nli dataset strengths the idea of using nonlinguistic supervision for sentence embedding learning is novel the proposed model does not require paired multimodal data for training which can be applied to lowresource language learning the finding that transformer models can generalize by learning a similar task across different modalities may shed light on future research on multimodal representation learning the experiments demonstrate its success on both visualcse and audiocse weaknesses though the authors attempted to answer the underlying rationale of the proposed model in sec 34 how the language model can be trained using other modalities remains unclear to me more analysis experiments are expected to strengthen the theoretical support for the framework the work has discussed how language dataset supervision can affect the final performance however the nonlinguistic dataset quality is not discussed what kind of nonlinguistic dataset is suitable for the paradigm does imagenet fit for all language training minors line 172 thesame rightarrow the same the authors discussed limitations on the reduction in uniformity of the sentence embeddings other possible limitations could be the marginal improvement with larger nonlinguisitic dataset docsepthis work is an extension of existing contrastive sentence representation learning instead of only using text data this work further bring the other modality to improve the model performance in this work the author proposed unpaired image or audio data as additional supervision signals the extra modality data share the same encoding backbone models and similar contrastive learning losses the 
author conducted experiments on standard sentence similarity evaluation datasets and achieved improvements over textonly models strengths the proposed idea is simple and reasonable by introducing imageaudio through the same transformer encoder the model transferred from text only to modal agnostic model this is quite different from other works which use separated structures for each modality one good point of the model is that the image data size 500 is much smaller than the text training data this means the model extra cost is small the author also well analyzed the proposed model weakness the model performance improvement is relatively small compared to other cl sentence representation learning work eg diffcse debiasedcse promptbert the proposed method performance with more data is not that significant the author also did not evaluate the performance on other modalities then it is hard to give a overall judgment of the model as a modal agnostic mode the overall training loss on different modality losses are also missing which in my view in also important to understand the contribution of different modalities i do not see important social limitations of this work docsepthis paper studies the nonlinguistic supervision to contrastivelearningbased sentence embedding learning the author proposes a transformerbased multimodal framework to learn textcontrastive and visionaudiocontrastive tasks with unpaired samples experiments on various semantic textual similarity tasks show the effectiveness of the proposed method summary of strengths the paper is clearly presented and wellwritten extensive empirical experiments with multiple runs not only clearly show the proscons for the sts tasks but may suggest the path for future improvement in the lowresource language a simple yet effective modification is proposed and shows consistent improvements over all systems summary of weaknesses it seems the paper is missing a comparison with mcse 1 which is a very relevant work that augments the simcse with multimodal contrastive objective it is unclear except for the objective and hyperparameters how other factors like pretrained imageaudio encoder augmentation methods dataset and imageaudio quality could affect the final performance of the proposed methods it is unclear to me how many epochsimages will the proposed methods improve over the simcse methods and how will that affect the final quality 1 zhang miaoran marius mosbach david ifeoluwa adelani michael a hedderich and dietrich klakow mcse multimodal contrastive learning of sentence embeddings arxiv preprint naacl 2022 it would be good for more discussion around the usage quality and amount of external multimodality datasets and will aligned dataset can also be leveraged by the proposed method docsepthis paper presents a contrastive method for learning sentence embeddings that incorporates additional modalities using unaligned data the method follows the simcse approach of clustering similar sentence representations and expands upon it by additionally clustering visual or audio input named visualcse and audiocse respectively because the approach does not require aligned data between the text and additional modality the method works for any language and additional modalities can be easily evaluated as well the authors show that visualcse and audiocse have clear performance gains in both new approaches across a range of downstream tasks compared to the baseline simcse additional ablations and experiments also show that the learned sentences embeddings have 
higher alignment and perform similarly to supervised cse when training with less data or increasing the level of noise strengths originality this is a really interesting concept that im a big fan of its an interesting take on using multimodal input to benefit language learning by using another modality in a completely unaligned way to improve the language embeddings quality this paper shows a number of experiments analyzing different aspects of the approach raw performance on downstream tasks when compared to simcse performance across 2 modalities compared to textonly training performance across 3 models performance across different seeds performance across different languages and ablations because there are so many experiments its easy to analyze and think about the contributions of the approach for different communities the visual ablation was interesting as it showed whether supervision from the visual domain in the form of labels helps performance seeing that it did not contribute much likely due to the smaller batch size also leaves room for future work section 51 is another strong ablation providing a relationship between the number of examples and amount of noise in supervised simcse versus unsupervised visualcse experimental setup is thorough especially by comparing across seeds and therefore across different subsets of the training data clarity the paper is clearly written and i could follow the motivation experimental setup and results without really needing additional background searches significance not requiring aligned data means virtually any modality can be plugged in here in addition the data being used in the existing modalities can easily be expanded or modified in addition not being tied to single language is a major contribution this is especially true when incorporating additional modalities as most work using images is quite limited in the number of languages with annotations for a quantitative perspective the results in table 2 show that visualcse and audiocse nearly always outperform simcse across different models and across downstream tasks these are impressive gains and really validates the idea of the approach weaknesses clarity in equations 2 and 3 the losss superscript wasnt immediately clear to me on the first read and doesnt make sense there to me can these just be explicitly mentioned in the text along the lines of the unsupervised loss based on data augmentation is as follows then for a supervised task like nli its loss is defined as significance while the authors showed the benefits of visualcse on languages other than english french german and russian i would have liked to see a lowresource language from outside of europe yes limitations are addressed ### Summary:
this paper improves contrastive learning of sentence embedding by using unpaired examples from the image or audio modality reviewers liked the significance of this work due to its simplicity and general applicability but some questioned the amount of improvement and advocated for the inclusion of low resource languages the authors included chinese which is noneuropean but not low resource
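To make the numeric columns easier to interpret, the sketch below shows how an (Input, Output) pair such as the one above could be turned into the input_ids, attention_mask, and labels lists that accompany each row. This is only a minimal illustration under stated assumptions: the tokenizer that actually produced these IDs is not identified in the dump, so the EleutherAI/gpt-neox-20b checkpoint is used purely as a stand-in, and build_features is a hypothetical helper rather than part of the dataset's tooling.

```python
# Minimal sketch, assuming a Hugging Face tokenizer; the exact vocabulary used to
# build this dump is not documented here, so the checkpoint below is illustrative.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

def build_features(input_text: str, output_text: str, max_length: int = 2048) -> dict:
    """Hypothetical helper: encode one row's Input + Output into the three columns."""
    # The Input field already holds the instruction, the concatenated reviews, and
    # the "### Summary:" marker; the Output field holds the target meta-review summary.
    enc = tokenizer(input_text + " " + output_text,
                    truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all 1s when no padding is applied
        "labels": list(enc["input_ids"]),         # in the rows shown, labels == input_ids
    }
```

For causal-LM fine-tuning it is also common to set the prompt positions in labels to -100 so that only the summary tokens contribute to the loss; the rows shown here do not appear to do that, which is why labels simply repeats input_ids.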
[ input_ids / attention_mask / labels for the row above: the prompt, reviews, and summary encoded as token IDs, an attention_mask of all 1s, and a labels list identical to the input_ids; the full integer lists are omitted for readability ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposed a variational model to infer the underlying directed graph of a dynamical system from observed features of the agents over a time period regularization terms were also introduced to encourage learing of directed interactions evaluations on a benchmark of datasets show good results strengths 1 the idea of iterative learning of the adjacency matrix seems new weaknesses 1 there are many regularization terms in this approach and all of them seems to be crucial for the overall performance of direct interation discovery which makes the information bottleneck formulation a little bit vague 2 it is not clear how the inner loop is conducted in section 44 yes they do docsepthe authors propose the iterative structural inference of directed graphs isidg which is able to infer the directed interactions between agents in dynamic systems with information bottleneck theory and extra regularization the proposed framework is examined on several simulated and synthetic networks to identify the interactions strengths  this paper focuses on inferring the underlying structure of agent interactions in timevarying systems which is a complex problem due to the dynamic feature apart from using information bottleneck of vae to infer the structure the proposed method also consider the sparsity and connectiveness of the asymmetric adjacency matrix property and design extra regularization term to address the related features ablation study also shows that the sparsity regularization term can especially help to infer direct connections weakness size of the dataset number of nodes is very small here is it because of the limitation of the datasets or the proposed framework is very computationally expensive  it would be beneficial if the authors add a ground truth subfigure in fig 2 the proposed method is only applied on simulated or synthetic datasets i strongly recommend examining the framework on at least two realworld datasets to show the priority of the proposed method to reconstruct the graph structure simulated or synthetic data are usually more predictable since the data generation process is known and with less uncertainty while high complexity and unknowns is more likely to be involved in realworld datasets i suggest the authors to look into the following related paper on reconstructing network httpsopenreviewnetforumidrjgtcir9tm httpswwwnaturecomarticless4146701909774x see weakness docsepthe paper studies the structural inference problem of reconstructing the asymmetric adjacency matrix of the graph in an unsupervised fashion specifically the paper introduces an iterative process based on variational information bottleneck that uses gnnbased encoder to infer directed interaction represented in the latent space which is then used to update the asymmetric adjacency iteratively starting from a fully connected graph the updated adjacency is decoded to predict the future states of the nodes the paper also introduces additional regularization terms in the objective function for realistic structure and eliminate the influence of indirect interactions strength the paper is wellwritten and the presentation is clear the preliminaries section provides a detailed explanation of the problem which the paper is trying to solve and figure 1 gives a clear overview of the model architecture the proposed iterative scheme for structural inference seems to be novel and simple yet effective although i 
have some concerns about the efficiency described in the weakness part the proposed method shows clear performance improvement the paper clearly addresses the problem of the influence of the indirect connections of the structural inference methods and provides the solutions for eliminating such challenges extensive analyses are provided in the appendix which gives further insights especially the analysis of the scalability experiments shows meaningful results even with the limitations weakness the proposed method seems to be quite slow due to the iterative scheme thus comparing timememory efficiency may address my concern especially as isidg shows marginal improvements on some datasets compared with fnri eg tf springs kuramoto i believe the efficiency comparison is critical furthermore the efficiency problem may be increased for larger datasets i would like to see the efficiency comparison on springs 100 and esc datasets as used in section e of appendix the influence of the hyperparameters of equation 21 is not sufficiently explained although the ablation study shows the results of each term in the loss without the analysis the significance of the proposed hybrid loss is not fully clarified the authors present scalability limitations in section e of the appendix which is a common limitation among the baselines ### Summary:
this is a borderline paper all reviewers liked the paper but were a bit concerned with the number of regularization terms the loss combines vae and information bottleneck approaches and thus the number of hyperparameters secondly the reviewers were also initially concerned with the size of the benchmarks but the discussion phase convinced them that these are standard in the field thirdly some reviewers pointed to more related work this however was not a crucial point since all reviewers find the paper interesting and wellwritten acceptance is recommended
[ input_ids / attention_mask / labels for the row above: token-ID encoding of the prompt, reviews, and summary, an attention_mask of all 1s, and a labels list repeating the input_ids; the full integer lists (cut off at the end of this excerpt) are omitted for readability ]
783, 4081, 34560, 6974, 323, 8350, 17032, 3133, 281, 320, 4460, 285, 2969, 2568, 3576, 3738, 891, 452, 690, 7350, 670, 253, 6733, 2529, 275, 253, 14855, 629, 253, 4081, 1332, 2722, 2590, 3045, 7756, 50275, 783, 2929, 4518, 12453, 253, 1895, 273, 253, 4833, 273, 253, 11686, 10291, 273, 253, 8350, 17032, 3082, 285, 3400, 253, 5482, 323, 23703, 824, 7881, 50275, 2068, 3134, 6260, 403, 2530, 275, 253, 30762, 534, 4245, 2007, 16039, 3340, 253, 1783, 273, 253, 9171, 1430, 4679, 2722, 14282, 1543, 1014, 342, 253, 7364, 50275, 20881, 1255, 50275, 783, 4081, 1332, 3133, 281, 320, 3240, 3468, 1955, 281, 253, 34560, 6974, 3021, 10941, 4522, 358, 358, 590, 6733, 778, 2953, 619, 4468, 3340, 347, 310, 301, 72, 2722, 16888, 11701, 327, 690, 15302, 2429, 342, 21486, 363, 24088, 28793, 29121, 42981, 32150, 891, 2868, 253, 6733, 5301, 310, 4619, 33810, 253, 6733, 1895, 778, 320, 2559, 323, 4067, 15302, 891, 651, 751, 281, 923, 253, 6733, 5301, 327, 29121, 2233, 285, 6262, 15302, 347, 908, 275, 2593, 299, 273, 30762, 50275, 783, 4833, 273, 253, 4373, 22041, 273, 5150, 3127, 310, 417, 10481, 5544, 3738, 253, 28913, 1263, 2722, 253, 1543, 273, 1016, 1307, 275, 253, 2957, 1293, 253, 1783, 253, 8453, 273, 253, 4081, 9769, 2957, 310, 417, 4751, 31637, 50276, 783, 4477, 1246, 9171, 1430, 7364, 275, 2593, 299, 273, 253, 30762, 534, 310, 247, 1846, 12291, 2190, 253, 1666, 25379, 50276, 187, 187, 4118, 18435, 27, 2520, 310, 247, 270, 636, 4115, 2929, 50276, 455, 30628, 10490, 253, 2929, 533, 497, 247, 2372, 7514, 342, 253, 1180, 273, 37820, 2426, 253, 2957, 24772, 362, 3348, 285, 1491, 3673, 44856, 7274, 285, 3021, 253, 1180, 273, 4373, 22041, 1273, 314, 253, 30628, 497, 671, 8523, 7514, 342, 253, 1979, 273, 253, 49602, 533, 253, 5955, 3408, 13762, 731, 326, 841, 403, 2629, 275, 253, 1673, 2626, 314, 690, 30628, 8042, 281, 625, 2905, 789, 436, 2299, 369, 417, 247, 9560, 1127, 50274, 17480, 512, 30628, 9010, 253, 2929, 4722, 285, 973, 15720, 14924, 310, 8521, 209 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
this paper proposes a bayesian optimization algorithm in the context of federated learning. the whole framework is built on top of generalized bayesian learning to overcome the locality of clients' distributions. the authors propose their solution as an integration of partitioned variational inference (pvi) and stein variational gradient descent (svgd). numerical experiments have been conducted on a synthetic dataset and some standard benchmark datasets, evaluated on both regression and classification tasks. the problem of federated bayesian learning is important, especially at a time when communication and data privacy attract public attention. the algorithm proposed is interesting and has shown benefits in the experiments. i have a few comments as follows. 1. the organization of the paper needs to be revised: compared with pvi and svgd, the proposed algorithm seems too long and too complicated, and the algorithm is not shown in the body but in the appendix. i think it might be better to decompose the proposed algorithms into several components, eg the server part and the agent part, and to pack the important updating steps into a procedure. 2. the motivation is not clearly stated in this paper; the comparison with pvi should be extended with more details. for example, it seems to me that the authors use a different way to optimize the local free energy functional compared with pvi. if my understanding is wrong please correct me, but if so, what is the motivation for doing so, and why should we consider svgd instead of natural gradient for this subproblem? 3. i would also like to see more theoretical understanding of the proposed algorithm, for example a convergence analysis for usvgd, even for the most simplified case, and a comparison with existing work; the density evolution and the fixed-points analysis are also good to include. i do see some analysis in the appendix, but i think some results are better presented in the main text. 4. the experiments are conducted on synthetic or small datasets; i think the authors should include experiments on larger datasets and/or more complicated models. 5. im not sure if "global iteration index" is a common term, but in federated learning (eg the fedavg paper) it is usually called communication rounds. 6. a quick question: the proposed algorithm and pvi only select one agent per communication round; what if more agents can be selected in a single round, like fedavg does?

docsep

this paper proposes distributed svgd, which maintains n particles both on the server and on the client. the communication between the server and the client is conducted by uploading/downloading these n particles, and the learning of the local client is formulated as inferring the corresponding tilted distribution. experiments are conducted on a synthetic gaussian 1d mixture, bayesian logistic regression on the covertype and twonorm datasets, and a bayesian nn on the uci dataset. the idea of federated bayesian learning is important and well motivated; however, using dsvgd for this purpose is not well supported in the paper. non-iid data on each client, the communication cost bottleneck, and limited computational resources (either storage or computation) are three characteristics of federated learning, yet this paper pays little attention to them. the performance of svgd relies on n (liu 2017), and more particles are needed for higher dimensional problems (eg nn). this property is unsuitable for federated learning, since dsvgd needs to transfer n particles between the server and the client in each round, and it needs to store and compute n particles on each client; the former increases the communication cost and the latter increases the burden of the client. besides, in the experiments the training dataset is randomly split into partitions of equal size among the k agents, which follows the iid setting; a non-iid dataset partition is needed to thoroughly evaluate the performance of dsvgd for federated learning. some minor points: 1. sec 3, svgd: the original svgd uses the particles directly as an approximation instead of using a kde. 2. eq 14: the notation of qj and qi is confusing.

docsep

paper summary: this paper introduces a new approach to probabilistic federated learning which builds on the previous pvi work of bui 2018. the proposed approach follows the same recipe as pvi, where local agents learn their own model posteriors from private data and communicate their posterior representations to a server, which aggregates the local posterior representations into a universal representation. local agents then download the aggregated posterior and offset it with their current posterior; the offset posterior is in turn used as the new local prior to rerun the corresponding local posterior approximation via a generalized form of variational inference. new local posterior estimates are subsequently communicated to the server, and so on. however, unlike pvi, the proposed method aims to replace the parametric representation of the posterior with a nonparametric particle representation developed by the prior svgd work of liu and wang 2016. this necessitates the development of a distributed particle aggregation algorithm in section 4, which is the key contribution of this work. this development is also motivated by two practical desiderata of federated learning: (a) a good tradeoff between communication load per iteration and the number of communication iterations, and (b) well-calibrated predictions that are more trustworthy. following the above summary, i will give my opinions regarding several aspects of this paper below. novelty and significance: on the high level of idea, this paper presents an interesting perspective on a practical federated learning system (communication tradeoff, trustworthy prediction); these are definitely important problems in the direction of making federated learning more efficient and robust, and this is the novel angle that i like about this paper. its technical development, on the other hand, is leaning a bit more on the incremental side, as the entire system is pretty much the same as that of pvi, with the exception that a new particle representation is considered instead of pvi's parametric representation in the statistical form of an exponential distribution. a common pattern here is that both representations allow universal posterior information to factorize additively across local devices in the respective forms of local posterior representation. in both cases this leads to a variant of a distributed sum problem, where each local party has some running estimate of some piece of local information; the goal is to communicate asynchronously so that each can refine its local estimate and eventually recover the correct sum of information. in the case of svgd, however, the exact local update would require buffering all previous particle representations (ie past estimates to date) so that the downloaded posterior can be accurately offset to act as a prior for the local model (ie independent of local data). this necessitates the development of a distillation scheme in section 4.2, which is to me the key technical contribution here. in addition, the theoretical analysis of udvsgd's per-iteration decrease of the kl divergence is also an interesting contribution. on this note, it seems the authors have deferred the demonstration of how well the kde distillation approximates the original particle representation to various places in the appendix; perhaps putting some of those back into the main text would be better if space permits. on the practical aspect of this paper (ie communication load, trustworthy prediction), while the demonstration is sufficient against point-estimate methods such as fedavg and dsgld, there is no comparison against other nonparametric probabilistic methods such as pvi (bui 2018) and/or pnfm (yurochkin 2019). given that the difference between pvi and dsvgd is a matter of posterior representation, a comparison against pvi is probably necessary to showcase that the particle representation yields better calibrated predictions. also, the probabilistic nonparametric federated learning work of yurochkin 2019 allows multiple rounds of communication, although it can also be used as a one-shot model fusion of pretrained local models, so it would be good to also compare both the communication load and the prediction calibration against this work. technical soundness: i have made a high-level check of the derivations and have not found any technical issues. clarity: the paper is very well written, especially the part that summarizes the background on svgd and pvi. review summary: in short, this paper presents an interesting perspective on nonparametric probabilistic federated learning via a particle representation of the posterior. the technical development is sufficiently novel, with demonstrated practical advantages against fedavg and dsgld; these practical advantages, however, were not demonstrated against existing probabilistic nonparametric federated learning works such as pvi and pnfm, which is perhaps strange given that dsvgd builds on pvi and is mostly different only in terms of posterior representation. post-rebuttal feedback: the authors have addressed most of my concerns; my rating for this paper therefore remains on the positive side.

docsep

promising but incomplete federated learning algorithm. this paper proposes a federated version of the stein variational gradient descent (svgd) method. the general approach to perform federated learning is based on a previously published method called partitioned variational inference (pvi); this work takes the pvi approach and adapts it to the svgd framework. the paper is in general well written and easy to follow: main ideas are clearly highlighted, and the technical parts are well structured and provide enough details. the studied problem is of great relevance because most of the data today is generated and stored in a distributed way, and the presented approach is sound and builds on top of well-established methods. strong points: the paper addresses a very relevant problem by combining two well-founded approaches; the use of particle-based variational inference methods in the context of federated learning is worth exploring; the presented approach is rigorous and well evaluated; the presented approach can train models with similar prediction performance to standard centralized approaches, and it is also able to produce well-calibrated predictions. weak points: the presented approach has limited practical use because of the current restrictions (ie updating one agent at a time); the convergence of the approach seems quite dubious once the current constraint of one agent updated at a time is lifted; the implementation of this method for federated learning of large deep neural networks casts doubts on the feasibility of the approach due to the high overhead of sending/receiving multiple sets of weights. i can not recommend the acceptance of this work for the following reasons. the originality of the method is low because it directly builds on top of two well-established approaches, pvi and svgd; however, the combination of these two approaches is not straightforward and shows how particle-based approximation methods can also be used in this challenging setting. in my opinion there is a relevant limitation to this approach which, although acknowledged by the authors, is not properly discussed: federated updates can only be done by one of the agents at a time, which implies that this particular algorithm is of limited practical use. one of the key points in federated learning is the possibility to exploit distributed computing infrastructure such as our mobile phones, so updating one agent at a time practically makes this possibility unfeasible. another limitation is the convergence of the algorithm, or at least an iterative improvement of the global free energy. the current version of the paper guarantees that the global free energy is decreased at every round mainly because the algorithm only updates one agent at a time and it essentially works as the standard svgd; the proof of convergence is directly borrowed from korba et al 2020, but i have strong doubts that this approach can provide this guarantee once the updating-one-agent-at-a-time constraint is lifted, because the pvi framework, which is the basis of this approach, can not guarantee a decrease of the global energy or convergence at every updating round. this is not properly discussed in the paper. another relevant limitation, which is inherent to the svgd method, is the number of particles to be used: each particle in the context of deep neural networks corresponds to the whole set of weights of the network, so the transmission of several sets of weights can lead to very significant communication costs/delays. this is not properly/explicitly discussed in the paper. minor comments: eq 17 is out of margin. post-rebuttal: i really thank the authors for their efforts. following my comments, i think they have really addressed my concerns; i therefore raise the score of the paper and recommend it for acceptance.

### Summary:
this work presents a distributed svgd (dsvgd) algorithm as a new nonparametric bayesian framework for federated learning. the reviewers were concerned with the practical advantages of the proposed method, including the communication cost and the constraint of updating one agent at a time. the authors' rebuttal helped address some of the concerns, including proposing a new parallel dsvgd algorithm; this is very much appreciated. however, given the significant modification needed over the original version, we think it is better for the authors to further improve the work and submit to the next conference.
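The reviews above repeatedly refer to the SVGD particle update that the paper distributes between a server and its clients. As a point of reference, here is a minimal, self-contained sketch of a plain, non-federated SVGD step on a toy 1-d Gaussian target. It is an illustration only, not the paper's DSVGD implementation: the server/client particle exchange, the tilted-distribution targets, and the KDE distillation step discussed by the reviewers are all omitted, and the kernel bandwidth, step size, and particle count are arbitrary choices.

```python
import numpy as np

def rbf_kernel(x, h=1.0):
    """Pairwise RBF kernel values k[i, j] = k(x_i, x_j) and gradients w.r.t. x_i."""
    diff = x[:, None] - x[None, :]          # diff[i, j] = x_i - x_j
    k = np.exp(-diff ** 2 / (2 * h ** 2))
    grad_k = -diff / h ** 2 * k             # d k(x_i, x_j) / d x_i
    return k, grad_k

def svgd_step(particles, grad_log_p, step=0.1, h=1.0):
    """One SVGD update: x_i <- x_i + step * phi(x_i), where
    phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * grad_log_p(x_j) + d k(x_j, x_i) / d x_j ]."""
    n = particles.shape[0]
    k, grad_k = rbf_kernel(particles, h)
    drive = k @ grad_log_p(particles)       # attraction toward high-density regions
    repulse = grad_k.sum(axis=0)            # repulsion that keeps particles spread out
    return particles + step * (drive + repulse) / n

# toy target: standard normal, so grad log p(x) = -x
rng = np.random.default_rng(0)
particles = rng.uniform(-5.0, 5.0, size=20)
for _ in range(500):
    particles = svgd_step(particles, lambda x: -x)
print(particles.mean(), particles.std())    # should be roughly 0 and 1
```

Even in this toy, the particle set is the posterior representation: the kernel-weighted gradient term pulls particles toward the target and the repulsive term keeps them spread out. That is also why, in the distributed setting the reviews discuss, a full set of n particles has to be shipped per round, which is the communication concern raised above.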
[input_ids: tokenized form of the example above; long integer array elided]
[attention_mask: all-ones array of the same length; elided]
[labels: token-id array matching input_ids; elided]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper highlights a very relevant problem in mri acquisition where corrupted raw image data needs to be improved upon in order to reach diagnostic image quality the strengths to highlight are 1 while many dl mri reconstruction methods are relying solely on imagebased data augmentations it is refreshing to read about this approach to include physicsdriven data augmentations as well it is also nice to see in the results that this as expected improves upon the purely imagebased approaches 2 the paper is well written and covers all the necessary basics especially the literature review is thorough while still leaving enough place for the methods and the experimental description 3 it is refreshing to actually see that the authors are providing the code and behind the link is a reasonable github repository sharing the trained models for several different scanner settings would be the next step forward there are only a few weaknesses 1 in the results presented in table 2 only the mean ssim and psnr values are given for the first dataset but wy are the standard deviations or standard errors not included and furthermore since this is all paired data you could easily visualize this table in box plots and also at least for some of the improvements do statistical testing with a nonparametric paired test to see if you have a significant improvement upon eg the baseline imagebased approaches although of course multiple comparisons is an issue when trying to do this for all 2 noise and motion levels were chosen based on visual inspections of clinical scans can this be elaborated you used different augmentations with different values on the training images and then visually compared it to clinical scans 3 your current training data consists of fullysampled examples in kspace ys with corresponding supervised reference ground truth images xs and undersampledonly kspace examples yu now through your augmentation you are trying to turn the undersampledonly kspace examples to more appropriate real examples but what about additionally including clinically relevant real corrupted data either form the clinic or from publicly accessible challenges such as httpsrealnoisemrigrandchallengeorg 4 regarding the motion modeling it seesm you are only modeling translation in image aka shifts in kspace bt what about including rotations in image so phase ramps in kspace rotations are almost always part of the problem when eg imaging children see eg httpswwwfrontiersinorgarticles103389fradi2021789632full docsepthe paper presents a solid methodology it is well written and clearly structured the underlying mri theory in terms of the encoding operator is nicely explained and formulated it is well described how the equivariant and invariant augmentation is combined in the training framework and how it is reflected in the loss functions also the paper demonstrates convincingly how both the physicsdriven augmentations and the imagebased augmentations can contribute to an improved reconstruction performance including an ablation of the individual features of the vortex framework the authors present a comprehensive performance analysis of the proposed vortex scheme including qualitative and quantitative comparisons with stateoftheart dl reference methods such as noise2recon mraugment or ssdu baselines source code is available on github the results look promising nevertheless they would be even more convincing if not only 
based on synthetic scenarios in fact id be curious to see how the model performs on real motion data on prospectively undersampled datasets or how it generalizes to patient data with pathologies the authors state that 2d poissondisc undersampling was used for model training and testing can you please motivate this choice in more detail in particular i am wondering how this type of undersampling relates to real prospective coherentincoherent undersampling techniques although the authors have conducted a thorough comparison with other dlbased methods i am wondering how they compare to nondl approaches eg csbased reconstructions limitations of their work are not discussed at all docsepto arrive at clinically relevant mriacquisition data augmentation is performed through the forward model section 3 targeting the multicoil setting with coil sensitivities and motion estimation affine transformations are indeed applied to augment data table 4 separate experiments on noise and motion are performed the motion model linear phase modulation is effectively a 1d translation model indeed apparent in fig 2 this is insufficient to capture 3d motion in practice the structure of the paper can be improved relevant information is now in appendices and overhead in the main paper for me it is difficult to follow the main line of the paper the noise level used in experiments of sigma04 is too high to arrive at meaningful results without smoothing out anatomical details compared to the ground truth in fig 3 vortex physics appears to treat motion as noise overly smoothing the image including clinically relevant anatomical detail eg in the meniscal area ### Summary:
i would like to thank all reviewers for their time and effort spent reviewing the paper and for their engagement with the rebuttal process i also thank the authors for their detailed rebuttal and the changes made to the manuscript all three reviewers agree that the paper should be accepted so i am happy to go with their recommendation and look forward to seeing the paper presented at midl
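The reviewer's point about motion modeling rests on the Fourier shift theorem: translating the image multiplies its k-space data by a linear phase ramp, while an in-plane rotation rotates k-space. Below is a minimal numpy sketch of that relationship; the 1D signal length and pixel shift are illustrative assumptions and not taken from the paper.

```python
import numpy as np

# Hedged sketch of the Fourier shift theorem behind the motion-modeling comment:
# translating a signal in image space is equivalent to multiplying its k-space
# data by a linear phase ramp. Signal length and shift are illustrative only.
n = 256
x = np.random.rand(n)                 # stand-in for one line of an image
k = np.fft.fft(x)                     # its "k-space" representation

shift = 5                             # translation in pixels
freqs = np.fft.fftfreq(n)             # spatial frequencies (cycles per sample)
phase_ramp = np.exp(-2j * np.pi * freqs * shift)

via_kspace = np.fft.ifft(k * phase_ramp).real
via_image = np.roll(x, shift)         # direct (circular) translation

print(np.allclose(via_kspace, via_image))  # True: image shift <-> k-space phase ramp
```

A rotation, by contrast, cannot be expressed as a phase ramp, which is why the review asks for it to be modeled separately.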
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper examines the source of translational invariance in cnns and points out that the convolution operation is translationally equivariant and not invariant authors thoroughly examine how training and architecture contribute to translational invariance in cnns my main concern about this paper is with respect to its novelty while the experiments are done in a thorough way the main point of this paper that translation invariance is formed during training is mostly known to the community the equivariance property of the convolution operation is discussed in machine/deep learning textbooks eg 2 and the importance of imageaugmentations like spatial jitter on network object recognition performance has repeatedly been demonstrated because of this reason i dont think this paper would be suitable for publication at iclr   other comments   the contribution of the paper is very unclear from the abstract even getting past the first two sections i was still left wondering what i should be expecting to see in the rest of the paper   the text needs more proofreading more than a few typos and misuse of words eg bases → basis human vision recognition → human visual object recognition  lecun et al 1998 is cited as a biological model which is not a good example of a biological neural network although cnns have some commonalities with biological neural networks they have many more differences 1 might be a better reference to an early biologically inspired neural net  it is claimed that cnns achieve neither rotation nor scale invariance however this is not a binary property what matters is the degree of invariance that could be measured   section 3 by nonpretrained network do you mean the untrained network  it is unclear from the text under section 3 what pretraining on the whole canvas is  the 1location dataset is not described anywhere in the paper and i had to go by a guess as to what this dataset contains   in section 5 the use of the cosine similarity measure instead of the accuracy which was used in the previous 4 experiments is not well motivated if this is a better measure why not use it in all experiments    1 fukushima k 1975 cognitron a selforganizing multilayered neural network biological cybernetics 20(3–4) 121–136  2 goodfellow i bengio y courville a 2016 deep learning vol 1 p 2 cambridge mit press docsepthis paper addresses the problem of how convolutional neural networks cnns achieve translation invariance and the authors argue that this invariance is mostly learned from suitable datasets rather than a result of the architecture in particular imagenetpretrained networks have learned to be invariant to translation when fine tuned the experiments are performed in mnistlike datasets evaluating classification performance at different locations the authors conclude that invariance is achieved when the cnn is trained with the different objects being presented at different locations across the canvas and that the invariance can be forgotten after subsequent training  pros  the experiments explore several settings and are convincing in clearly showing that translation invariance requires that the network observes the different objects translated across the canvas in their particular setting although i have some concerns about the setting  understanding how neural networks achieve certain properties such as translation invariance is very relevant  the paper is well written and can be followed easily i like
particularly the illustrations of the experiments   cons  the main concern i have is that the insights are relatively incremental the experimental setting replicates blything et al 2020 and the main insight that cnns can learn translation invariance from suitable large and diverse datasets such as imagenet was already shown in that paper note that although available in arxiv in sept 2020 the authors are aware of the work since they state that they use blything et als dataset and replicate their experiments the results in the submission are not showing significantly novel insights  results are shown with small datasets mnistlike but not clear how they extrapolate to more complex ones it is also not clear to me that it is possible to train a heavy model such as vgg16 with such small resolution datasets even when they are translated to different locations it probably results in very significant overfitting  only evaluated on vgg16 to be more convincing in the general claim it is necessary to also evaluate other models   the authors only analyze translation invariance of the whole network it would be more interesting to analyze invariance of the different layers via intermediate representations experiment 3 for instance encourages local translation invariance within each quadrant but not across the whole canvas i would expect that higher layers still behave like vanilla vgg16 which overfits to the location while lower layers show higher level of invariance   fine tuning the whole network i understand the authors train/fine-tune all the layers in this setting which is probably leading to significant overfitting and therefore to catastrophic forgetting the authors should consider the case where only the classifier is trained and a variable number of layers at the top thus less prone to overfitting and avoiding forgetting in lower layers to further assess the invariance in different layers  questions please clarify cons  minor comments some figures seem to suggest that images have multiple objects in multiple locations my understanding is that every image has only one object and the location can change so those figures may be misleading please clarify and modify the figure if necessary docsepthis paper analyses and studies translation invariance in convolutional neural networks it argues that typically it is claimed that cnns are translation invariant due to the convolution function and that actually convolutions are equivariant while pooling is the actual function that gives local invariance or global when the pooling is across all locations this is not included in the description of invariance one neural network vgg16 is used for the analysis in different scenarios 1 pretrained on imagenet and finetuned to the new dataset on one location 2 trained from scratch using the new dataset in one location 3 trained from scratch using the new datasets in all locations of the canvas and test on the other datasets the main conclusion in the paper is that cnns are not invariant to translation by design of the architecture but that when pretrained on naturalistic images they can be  positive points  the research question is important and this kind of analysis to understand cnns is crucial to better understand network capabilities and being able to predict their behaviour  comment after rebuttal period given that there was no rebuttal i keep my initial rating
studying how using pretrained networks affects the generalization in some factors eg translation invariance in this paper is interesting and novel  the paper is well written and easy to follow   concerns   the main concern is about the experimental setup and results presented in the paper only one network is used for the study to validate if it is indeed the architectural design that gives translation invariance it is a weak statement without other cnn architectures with different configurations number of layers amount of local pooling whether global pooling is used the effect of strides etc being analyzed the current observations might only apply to vgg16   the results in figure 2 b when comparing the translation invariance across the different datasets it is mentioned that depending on the dataset that the network is initialized from it brings more or less translation invariance a deeper analysis on the amount of data used the similarity between the objects across datasets and how this affects the final performance would be nice to include here talking about invariance is a bit confusing and it might be that this robustness to position changes is more related to the generalization properties to the other datasets due to the resemblance of the objects and obtained by experience   it would have been nice to see the behaviour in another transformation as well since it would strengthen the claim that the invariance i would say robustness to transformations is not due to the architecture but to previous exposure to naturalistic images and if it is not it would bring some light into why it is for translation and not for other transformations   minor comments   as a side comment since there is some motivation in the introduction relating to human visual processing humans have a loss of acuity recognition performance with distance to the focal point and perceive high accuracy in the fovea and low accuracy in the periphery which is not captured by typical cnns ### Summary:
this paper receives 3 initial rejection ratings no rebuttal was submitted by the authors there is no basis for overturning the reviewers decisions this paper should be rejected
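The distinction these reviews keep returning to, that convolution is translation-equivariant while only pooling across all locations yields invariance, can be checked with a small sketch. The image size, random kernel, and circular boundary handling below are illustrative assumptions, not the paper's VGG16 setup.

```python
import numpy as np
from scipy.signal import convolve2d

# Hedged sketch: convolution is translation-EQUIVARIANT (the feature map moves
# with the input); translation INVARIANCE only appears after pooling over all
# spatial locations. Circular boundaries make the equivariance exact.
rng = np.random.default_rng(0)
image = np.zeros((32, 32))
image[4:9, 4:9] = rng.random((5, 5))            # an object at one location
shifted = np.roll(image, (10, 7), axis=(0, 1))  # same object, translated

kernel = rng.random((3, 3))
fmap = convolve2d(image, kernel, mode="same", boundary="wrap")
fmap_shifted = convolve2d(shifted, kernel, mode="same", boundary="wrap")

# Equivariance: shifting the input shifts the feature map by the same amount.
print(np.allclose(np.roll(fmap, (10, 7), axis=(0, 1)), fmap_shifted))  # True

# Invariance only after global pooling across locations.
print(np.isclose(fmap.max(), fmap_shifted.max()))                      # True
```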
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper aims to offer a theoretical analysis of training with data augmentation and an associated consistency loss while it is intuitive that training with data augmentation and consistency loss will help this paper offers a theoretical justification of the intuitions the simple framework to view dac as a hypothesis space complexity reduction technique is neat and intuitive  strengths this work presents several clear strengths  this paper is one of the several works pioneering the discussions of data augmentation when used together with consistency loss although several preceding works have been ignored 1 2  after the general form is introduced several applications can be directly extended which shows the potential of this work  an interesting definition of the strength of the augmentation  weakness however i also have several major concerns about this work for example  if i understand correctly theorem 1 is essentially a reuse of the standard generalization error bound with a replacement of the original hypothesis space by the regularized hypothesis space and then the main argument is that since the regularized hypothesis space is believed to be smaller then the new bound is tighter overall i dont think this result is significant enough especially considering it takes a major position in this paper i think it is too trivial to be considered as an important theorem of a publication at this level it might be more appropriate to call it a lemma or a proposition of the theorem of the error bound with standard hypothesis space a smaller upper bound does not really say much about the performances both of the bounds could be not tight and even if they are tight some discussion of how much smaller the regularized one is would be helpful  while it is very interesting to see the definition of the strength of the augmentation it does not say much in practice without a deeper or broader discussion the definition seems to be a math brick for the theorem discussions on how it is linked to the practice could be helpful especially since the definition plays a central role as the major assumption in following theoretical results the intuitive explanations offered by the authors are not intuitive enough in particular for the examples of using rotation as augmentation the authors state they use rotation in this paper what are the intuitions of the definition how do different degrees of rotation correspond to different strengths and can any numerical evidence be reported also whats the intuitive explanation of assumption 1  with theorem 1 being trivial as discussed above theorem 2 is probably one of the most important results in this paper yet there seem to be some issues the proof of theorem 2 critically depends on assumption 1 this is not made clear in the main text it might be better to put the full statement at least the full definition of d back into the main manuscript  the empirical results do not offer any validation to the new theoretical discussion but are used to show that training with dac can benefit in comparison to erm a fact that the community has known for a while  1 invarianceinducing regularization using worstcase transformations suffices to boost accuracy and spatial robustness 2 squared l2 norm as consistency loss for leveraging augmented data to learn robust and invariant representations  overall i feel like the theoretical discussion is not significant enough and the associated experiments are also weak in
the context of theoretical papers docsep summary the paper introduces a statistical framework to analyze data augmentation to interpret consistency regularization as a way to reduce function class complexity building upon this framework the paper  shows that for linear regression consistency regularization is more efficient than empirical risk minimization  provides generalization bounds under consistency regularization for logistic regression and twolayer neural networks  provides a generalization bound for expansionbased data augmentations for multiclass classification  strengths 1 the paper introduces a formal framework to study data augmentations and consistency regularization the ideas are simple but novel with several meaningful results and implications 2 the paper is quite wellwritten with the main ideas outlined clearly the approach is wellmotivated and seems to be built upon established literature the review of related works is informative  weakness 1 my main concern with the paper is about the results of theorem 3 and theorem 5 which appear very much weaker than they should be  theorem 5 for some δ ∈ (0, 1) with probability at least 1 − δ from a mathematical point of view this is an extremely weak result δ could be 0.999999 a regular generalization bound would have stated for all δ > 0 when n is large enough  theorem 3 with constant probability taking the statement at face value it would be that the probability of the event stated in the theorem remains constant as n changes which is obviously not true i suppose that the author means with probability bounded from below by a constant even in that case this is a very weak result wouldnt we want the probability to be at least close to one or say greater than 1/2  the appendix which is supposed to contain formal statements and proofs also runs into that problem statements about constant probability appear in several places the main assumption of the analysis assumption 3 roughly n ≳ 4 d/daug · log(1/δ) is also stated for some δ > 0  theorem 4 technically doesnt share the same problem however its result is only meaningful under assumption 3 which is subject to the same constraint  if i understand correctly the proofs of the paper might be adapted to support typical generalization bounds but the manuscript in its current state didnt do that and the results of the theorems do not make much sense  2 the technical contributions of the other main results of the paper theorem 1 and theorem 2 are limited the proof and result of theorem 1 are straightforward the result of theorem 2 for linear regression is interesting but the proof involves standard computations  other comments  throughout the manuscript and also in the appendix which is supposed to contain rigorous proofs the author used several asymptotic notations such as ≲ and big-o notation which may not be appropriate since 1 the analyses of the paper are nonasymptotic and involve many parameters 2 in rigorous nonasymptotic analyses statements such as n ≥ n(δ) are important but will be obscured using big-o notations  a significant part of the proof in the appendix 10 pages was to prove several results that correspond to a three-sentence remark in the main texts with no result statements if the authors believe that this is a central point a subsection in the main text should be created to provide the details otherwise i suggest removing those parts out of the text just the remarks and the appendix since those materials are not central to the content totally not peerreviewed and shouldnt be associated with the paper if it is
accepted  example 1 it can be verified that: it would be helpful if a short verification is included in the appendix  proof of theorem 2 the rest of the proof is identical to standard regression analysis: this part is central to the result so more details should be spelled out or a reference should be given  notation: rho is used both as a metric and as the subgaussian constant this is further confusing since they appear close to each other at times  proof of lemma 5 the last statement on page 28 is incorrect o(d) does not dominate d  there is a typo in the equation 2 216  questions how is the regularizing constant lambda = 10 chosen in the last 4 experiments while it is not the optimal value in the first one  overall my vote for the paper is a weak reject i think the framework of the paper is original and the ideas are intuitive with several interesting results and implications on the other hand the statements of some of the results are very weak to an extent that they are not meaningful the writing of the main text as well as the proofs are nonrigorous which was partially an intention of the authors at least for the main text but might have affected the papers mathematical quality  update after response and revisions the revision addresses my main concern about the weakness of the results of theorem 3 and 5 on the other hand im still of the opinion that the heavy use of asymptotic representations and the choice to present the results of the main text in a nonrigorous manner have affected the papers mathematical quality a few other concerns are left unaddressed i thus raise my score from 5 to 6 docsepdata augmentation is a common technique to improve generalization especially when data is scarce this paper introduces a theoretical framework for analyzing the effectiveness of consistency regularization when data augmentation is employed in the limit consistency regularization is akin to solving a constrained optimization problem with consistency constraints the paper theoretically studies this limit for linear regression logistic regression and a twolayer perceptron with relu activation and tries to characterize the benefits of consistency regularization beyond that of vanilla data augmentation the paper then continues to experiments where it is shown that consistency regularization outperforms data augmentation on three benchmarks and the benefits are significant especially when labeled data is scarce  data augmentation da is a common technique to improve generalization especially when data is scarce this paper introduces a theoretical framework for analyzing the effectiveness of consistency regularization when data augmentation is employed in the limit consistency regularization is akin to solving a constrained optimization problem with consistency constraints  while data augmentation is used for better generalization with less data it is also used to suppress spurious features and generalize to unseen distributions a discussion around that is warranted also the theory unfortunately is not applicable to this other major usecase of da  while this paper introduces a regularizer rho in eq 1 the exact functional form of the regularizer is of no use as long as it is a proper divergence between the original samples and the augmented samples however this is practically not true as the functional form of the regularizer can make or break obtaining results this is a shortcoming of the analysis in this paper which needs to be clearly reflected  eq 2 can be obtained as a limit of eq 1 when lambda to infty only if
rho satisfies certain regularity conditions eg being a divergence no such conditions are stated in the paper so the deduction is wrong as stated  the mathematical model for da that is described in definition 1 is limited in scope for example it cannot capture some of the motivating works that the paper cites such as mixup this needs to be clearly stated  the operational meaning of the da strength in definition 2 is unclear beyond the linear examples that are provided as this does not capture any nonlinear relationship between features as far as i followed from a theoretical perspective the data augmentation in this paper $\widetilde{\mathcal{A}}(x)$ is always linear for deriving the main theorems 2–4  it is not clear how definition 2 does or can capture randomness in training examples or the augmentation function as $d_{\text{aug}}$ itself becomes a random variable another important relationship that is missing is the relationship between $d_{\text{aug}}$ and alpha which is the number of times data is augmented can you at least say what happens when alpha to infty  one major issue that has been ignored is that $\widehat{h}_{\text{dac}}$ as defined in eq 2 is not directly solvable through the empirical risk minimization framework even worse it is unclear whether there is any way to devise a stochastic solver for it hence the theoretical study of the paper is only applicable to a solver that does not practically exist  in theorem 1 it is unclear how one would be able to relate the rademacher complexity of $T_{\text{dac}}(\widetilde{\mathcal{A}}, x)$ to alpha and $d_{\text{aug}}$ hence it is not clear how to interpret the improvement beyond there is some improvement associated with dac compared to vanilla da which we already knew  in theorem 1 what is cl  in theorem 1 and its proof the data augmentation seems to have been assumed to be a fixed linear transformation that should be clearly communicated in particular in the second line of the proof on page 13 in appendix a1 there is no expectation with respect to the randomness associated with da even if the da function is deterministic the set $T_{\text{dac}}(\widetilde{\mathcal{A}}, x)$ is random so i dont know how to interpret the inequality  the paper theoretically studies this limit for linear regression logistic regression and a twolayer perceptron with relu activation and tries to characterize the benefits of consistency regularization beyond that of vanilla data augmentation  please change the notation of $\widehat{h}_{\text{erm}}$ in eq 3 i suggest calling this daerm rather than erm not to confuse with erm without augmentation  the benefit of the consistency regularizer in lr is characterized in theorem 2 through a quantity called d it is not clear to me how to interpret that even after reading remark 2 and example 1 can you please provide a more intuitive explanation  can you please better explain figure 2 is the x-axis d and how is it calculated  why is the bound in theorem 3 presented in this obscure form rather than the more commonly used high probability bound  wouldnt data augmentation without consistency regularization also admit the same form as theorem 3 if so then how does theorem 3 characterize benefits of consistency regularization in particular i am not sure if i can follow the argument about the generalization properties of $\widehat{h}_{\text{erm}}$ stated without proof after theorem 3 and i suspect it to be incorrect  the proof of theorem 4 is inscrutable i could not make my way through it although i consider myself to be on the more theoretical side of the spectrum so i would say the mathematical exposition of the paper is not accessible to the
iclr general audience  i could not follow what the goal of section 4.3 is  theorem 5 is stated for realizable model classes it is known that in this case the generalization bounds admit fast o(1/n) rates rather than the usual o(1/sqrt(n)) rates see tsybakov 2004 steinke and zakynthinou 2020 hence the upper bound in theorem 5 is orderwise loose and i am not sure how to interpret it  the paper then continues to experiments where it is shown that consistency regularization outperforms data augmentation on three benchmarks and the benefits are significant especially when labeled data is scarce while i like the results there are several major concerns here  first it is unclear what form of regularizer function rho has been used to produce the results in the experiments section can you please clearly explain the regularizer for all experiments  as is well known and can also be seen in the experiments when dealing with overparameterized nonconvex neural models choosing lambda to infty practically results in convergence to poor local minima that do not generalize for example this can be seen in table 1 for lambda = 20 and if lambda is even further increased the resulting model would achieve the performance of a random classifier this is a big discrepancy between the theoretical setup of this paper and the empirical setup and a discussion around this shortcoming is warranted  in table 1 can you please explain the drop of the performance seen for lambda = 1 one would intuitively expect that when increasing lambda from 0 the performance would increase then plateau and then start to deteriorate hence the lambda = 1 performance is counterintuitive it is claimed that the dac regularization gives a worse performance as it falsely enforces dac regularization where the label of the augmented sample may have changed however this is not substantiated through sufficient evidence either change the language to might be explained or provide more evidence  typos pg 4 after 1 as an regularizer → as a regularizer pg 17 statement of thm 8 satisfies → satisfy  references alexander b tsybakov optimal aggregation of classifiers in statistical learning the annals of statistics 32(1):135–166 2004 thomas steinke and lydia zakynthinou reasoning about generalization via conditional mutual information in proceedings of thirty third conference on learning theory volume 125 pages 3437–3452 09–12 jul 2020 this paper develops a mathematical framework for consistency regularization several theoretical results and empirical results are provided overall i like the general idea of the paper however there are several major concerns with the way it is executed in particular the major issue is that the theory and experiments are disjoint and do not support each other 1 the theory is too tied to linear models and it is unclear how it could be extended beyond linear models 2 the theory is only applicable to linear models with lambda to infty this regime is not even viable in the empirical world when we deal with neural models as also evidenced in the experiments of the paper 3 while the experiments are interesting they are not conclusive on their own and lack important details for example the regularizer function rho is not specified also baselines for these benchmarks are not compared against while there are many things that i like about the paper it does not tell a coherent story and especially one that would benefit the iclr audience and hence i recommend the paper to be rejected in its current form i hope the authors can clarify some of the explicit
commentsquestions that i have raised in my review during the rebuttal update after author response i would like to thank the authors for their extensive responses to my original comments as well as the followup comments and also for their revisions to the paper which has improved the paper significantly thus i am raising my score from 3 to 5 while some of my previous concerns are addressed there are many remaining concerns that would require another carefulextensive revision and would require another round of review which is why i still dont think the paper is ready to be accepted i would like to emphasize that i like the general formulation of the problem and the general positioning of the paper and i think the paper would be a nice contribution to the literature once the problems especially with mathematical exposition are fixed here are some explicit pointers for the authors imprecise and inscrutable mathematical exposition there are lots of imprecise statements which also reviewer s3ue complained about still in the revised paper for example what does gg mean in assumption 2 page 15 at the same time the math is not followable i still could not follow some of the proofs hard to gain intuitiontakeaways from theoretical results while the authors present several results it is hard to understand takeaways from the developed theory while they have addressed many concerns several still remain for example rhs of the revised thm 5 does not depend on any of the data augmentation parameters such as dtextaug and alpha what is the takeaway from this theorem given that the same bound also applies to erm and daerm gap between theory and practice not discussed although i explicitly gave feedback to address the gap between the theory and practice it is not well discussed yet for example the theory is developed for lambda to infty while there is a practical sweet spot for lambda in the experiments based on the optimization challenges while i think that this gap should not be the reason to not accept the paper it warrants a discussion around the results and their practical applicability which is currently missing from the paper i hope the authors would find these comments useful in revising their paper for a future submission docsepthe paper proposes a regularization approach based on data augmentation and develops learning bounds for that setting the main motivation is to provide means to characterize reduction in sample complexity as a result of employing data augmentation techniques during training what would be the difference between data augmentation and consistency regularization this is unclear from the introduction and it serves as a motivation for this work previous work has demonstrated that regularized risk minimization problems with augmented training samples the setting known as vicinal risk minimization can have identical effect as consistency regularization ie low variance between predictions over original and augmented samples please check out the work on vicinal risk minimization and references further developing that line of work 1 o chapelle j weston l bottou v vapnik nips vicinal risk minimization definition 1 this definition is overly restrictive for some instances it is natural to have the conditional distribution with larger entropy and yet the instance is a proper augmentation there are definitely transformations of images that are more confusing relative to labels of interest when compared to the notion of original instance this assumption is thus not motivated by practical 
considerations and might be an artefact required for theoretical analysis here it is also important to properly introduce original because in different worlds pairs x y sampled from a data generating distribution are different this would then imply that the notion of original is not unique definition 2 it is common actually that the data lies on a lowdimensional manifold and augmentation is in a number of cases just a walk over that manifold thus i see no reason why this metric would be of any use in assessing the informativeness of data augmentation also the difference is the perturbation over the input space which might have nothing to do with the manifold that we would like to learn and over which the data generating distribution actually operates consistency regularization what is so special about eq 1 in the context of prior work not discussed nor covered in details is this not equivalent to learning with empirical risk minimization and augmented samples known as vicinal risk minimization for 20 years for the latter it can be demonstrated theoretically that it minimizes the variance over the neighborhood in the instance space defined by augmented samples i can see that eq 1 might provide more flexibility when it comes to the enforcement of label alignment over such neighborhoods but that needs to be demonstrated the vicinal setting is also quite flexible the theoretical results are under restrictive assumptions and in my understanding apply only to linear models theorem 4 is an attempt to obtain a theoretical result on the effectiveness for two layer neural network but the assumption xitopb xijtopb actually restricts this to linear models as well experiments the performance improvements are rather modest 1 relative on cifar100 in the training setting were the batches ordered in the same way the problem is nonconvex and that might impact the generalization performance table 2 the experiment with different number of samples 1000 20000 why only 3 augmented samples per training instance in such cases augmentation typically adds many more samples and it would be nice to see the results with 10 additional samples per training instance as in the case for the experiment in table 1 table 3 how can the performance be consistently the same with 3 7 and 15 augmented samples this might be an indication that the augmentation scheme does not do enough to describe the local neighborhoods around training samples this in turn might then not be an objective assessment of the effectiveness table 4 the same argument as for table 3 one would expect much more differentiation between numbers for strength 2 5 and 10 in fact this experiment increases my scepticism in the utility of the daug as a measure of augmentation informativeness the assumptions under which the paper operates are quite restrictive and do not see that they hold in practice and thus apply to widely used data augmentation schemes along with deep learning models the crux of the theoretical contributions is restricted to linear models which further limits their scope the paper does not also cover well the prior work on empirical risk minimization problems especially the line of work on vicinal risk minimization experiments show rather incremental improvement 1 relative and it is quite strange that there is no substantially larger effect of augmentation when the replication factor goes from 3 to 15 ### Summary:
this paper shows how constraining the representation to be invariant to augmentation shrinks the hypothesis space to improve generalization more than just introducing additional samples through augmentation i agree with the reviewers that this is a novel intuitive and interesting finding however there were many technical and clarity issues with the original submission these were partially addressed by the authors in the rebuttal the reviewers appreciated the authors efforts and commitment in the rebuttal but my conclusion from our discussion is that this paper requires another round of revisions i hope the authors would follow the reviewers comments improve the paper and resubmit
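The reviews above repeatedly contrast a penalized objective (the paper's "eq 1") with its constrained limit ("eq 2") as lambda grows, and one reviewer notes the limit only holds if the regularizer behaves like a divergence. Since the reviewed paper itself is not reproduced in this example, the block below is only a generic sketch of that penalty-to-constraint limit, with placeholder symbols (an empirical loss, a regularizer rho, a hypothesis class F) rather than the paper's own notation:

```latex
% Schematic only: \hat{L} is an empirical loss, \rho a consistency
% regularizer, \mathcal{F} a hypothesis class -- placeholders, not the
% reviewed paper's exact objects.
\hat{f}_{\lambda} \in \operatorname*{arg\,min}_{f \in \mathcal{F}}
    \; \hat{L}(f) + \lambda\,\rho(f), \qquad \rho(f) \ge 0.
% Constrained counterpart (the "eq 2" the review refers to):
\hat{f}_{\infty} \in \operatorname*{arg\,min}_{f \in \mathcal{F},\; \rho(f)=0}
    \; \hat{L}(f).
% If \rho is a proper divergence (\rho \ge 0, with \rho(f)=0 exactly when the
% consistency constraint holds) and mild regularity holds (continuity of
% \hat{L} and \rho, compact \mathcal{F}, nonempty feasible set), accumulation
% points of \hat{f}_{\lambda} as \lambda \to \infty solve the constrained
% problem -- which is the regularity condition the reviewer says is missing.
```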
input_ids: [ token ID list omitted ]
attention_mask: [ list of 1s omitted ]
labels: [ token ID list omitted ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: comments after rebuttal overall the authors have addressed many concerns in the latest version of the paper in the current form i have a much better impression of this work and thus raise my score to a point where i can recommend acceptance in the following i add some afterrebuttal comment at the end of each item of my original review summary this papers presents an application of the cayley transform to parameterizing a subset of orthogonal convolutional layers as i understood in contrast to other methods that orthogonalize the convolution by reshaping the kernel the orthogonalization is performed in the frequency domain the cayley transform is applied to the kernel matrices corresponding to each pixel after applying the fft and then the output is recovered using the inverse fft then it is shown empirically that this method preserves norms and that either this method or a combination of the cayley transform and a method from a different paper rko improves upon certified robustness using the simple lipschitzmargin bound some marginal gains on clean error are also reported pros 1 quality the empirical evaluation is good at comparing between baselines however it could be improved see cons 2 originality it seems interesting and novel to do orthogonalization using the cayley transform in fourier domain as this seems to bypass limitations of other methods that have troubles trying to orthogonalize the matrix corresponding to the convolution the main issue is that such process usually returns a matrix that does not correspond to an actual convolution and this paper seems to avoid that issue altogether 3 significance from experimental evaluation it seems that the proposed method is a strong candidate among other alternatives to orthogonal convolutional layers however it is also more expensive afterrebuttal the authors have added additional experiments and show that in practice their method can be as fast or faster than other baselines in certain scenarios 4 significance it seems that the space of orthogonal transformations parameterized in the way proposed in this paper can be more expressive than previous work bcop or rko although this could be clarified further by the authors cons 1 clarity after reading the paper multiple times it is not clear to me if this is a heuristic argument or if we can prove that the output of the socalled cayley layer is indeed orthogonal that is can we show a theorem like let fx be the output of the cayley layer algorithm 1 then fx x 1 for all x is this true does it make sense it looks like experiments in figure 1 are trying to validate this empirically but i wonder if this formal result could be obtained afterrebuttal the authors make this more clear in sections 3 and 4 2 clarity i feel that there are many terms thrown around without a clear definition and it makes the arguments confusing in particular i am talking about the whole of section i elaborate further first skewsymmetry is defined for square matrices and then the term skewsymmetryze is for the transformation faaat so what does skewsymmetryze mean if it means a function such that its ouput is skewsymmetric then it would be trivial to output any constant skewsymmetric matrix the important thing about faaat is that it is in fact the orthogonal projection of a onto the space of skewsymmetric matrix modulo a factor of 12 then what does it mean to skewsymmetrize a convolution here the authors use the 
notation convwx which is not defined anywhere but i guess that it means the linear operator corresponding to the convolution with kernel w but is this a single kernel matrix i assume because otherwise what is wt in textconvwt note that in the previous paragraph w is defined as a tensor of higher rank4 so its transpose is not defined in line with the previous the vanilla way to skewsymmetryze convw would be convw convwt what is the relation with the expression convw convwt in summary i found this paragraph extremely confusing afterrebuttal the authors go over the definitions and now make more clear statements in section 4 which makes the argument more concise the current version is more enlightening to the reader 3 clarity i am confused by the two proposed methods crko and cayley which i think are not properly highlighted in the architecture considerations in page 6 what do you mean that for our cayley layer we use the cayley transform for consistency what is the alternative isnt using the ct the whole point about crko and cayley layer what is the difference between the cayley layer and crko on a first read i thought they were the same looks like crko is rko cayley transform but it would be good to clarify the differences ### Summary:
very good paper it proposes a novel parameterization of orthogonal convolutions that uses the cayley transform in the fourier domain the paper discusses several aspects of the proposed parameterization including limitations and computational considerations and showcases it in the important application of adversarial robustness achieving good results the reviews are all very positive so im happy to recommend acceptance also a big shoutout to the reviewers and to the authors for being outstanding during the discussion period the reviewers engaged with the paper to a great depth and the authors improved the paper considerably as a response well done to all of you
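The review above turns on two pieces of linear algebra it found under-explained: skew-symmetrizing a weight matrix and passing it through the Cayley transform to obtain an orthogonal, norm-preserving map, which the reviewed paper reportedly applies per frequency after an FFT of the kernel. That paper is not reproduced here, so the sketch below only illustrates the generic dense-matrix case; the NumPy code and the function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def skew_symmetrize(w):
    # w - w.T is skew-symmetric; (w - w.T) / 2 would be the orthogonal
    # projection of w onto the skew-symmetric matrices (the "factor of 1/2"
    # point the review raises). The scaling does not affect orthogonality
    # of the Cayley transform below.
    return w - w.T

def cayley(w):
    # For skew-symmetric s, (I - s)(I + s)^(-1) is orthogonal, so applying
    # it preserves Euclidean norms; I + s is always invertible because the
    # eigenvalues of a skew-symmetric matrix are purely imaginary.
    s = skew_symmetrize(w)
    i = np.eye(w.shape[0])
    return (i - s) @ np.linalg.inv(i + s)

w = np.random.randn(8, 8)
q = cayley(w)
x = np.random.randn(8)
print(np.allclose(q.T @ q, np.eye(8)))                         # orthogonality
print(np.isclose(np.linalg.norm(q @ x), np.linalg.norm(x)))    # ||Qx|| = ||x||
```

In the method as the review describes it, the same construction would be applied to each frequency-domain kernel matrix after an FFT, with the inverse FFT recovering the spatial output; that per-frequency step is omitted from this dense sketch.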
input_ids: [ token ID list omitted ]
attention_mask: [ list of 1s omitted ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 26122, 846, 30080, 22559, 4583, 253, 4477, 452, 9713, 1142, 7350, 275, 253, 6323, 2715, 273, 253, 2929, 275, 253, 1655, 830, 891, 452, 247, 1199, 1805, 13214, 273, 436, 789, 285, 3021, 7164, 619, 4868, 281, 247, 1127, 835, 891, 476, 5583, 14924, 275, 253, 1563, 891, 823, 690, 846, 250, 2858, 22559, 4385, 387, 253, 990, 273, 1016, 5382, 273, 619, 3236, 2278, 50275, 8774, 436, 9380, 10262, 271, 2898, 273, 253, 260, 333, 2205, 4979, 281, 4764, 3006, 247, 8578, 273, 19627, 27311, 267, 8090, 347, 891, 7192, 275, 4499, 281, 643, 3082, 326, 19627, 907, 253, 27311, 407, 40206, 15609, 253, 10295, 253, 19627, 1320, 310, 2684, 275, 253, 4294, 5028, 253, 260, 333, 2205, 4979, 310, 3732, 281, 253, 10295, 12624, 3969, 281, 1016, 12275, 846, 9433, 253, 269, 649, 285, 840, 253, 3453, 310, 12372, 970, 253, 13737, 269, 649, 840, 352, 310, 2011, 45190, 326, 436, 1332, 31221, 22429, 285, 326, 2057, 436, 1332, 390, 247, 5019, 273, 253, 260, 333, 2205, 4979, 285, 247, 1332, 432, 247, 1027, 2929, 391, 7381, 19132, 2220, 18065, 31640, 970, 253, 2969, 11233, 37913, 15456, 3033, 690, 16888, 15988, 327, 4076, 2228, 403, 671, 2361, 50275, 856, 84, 337, 3290, 253, 16774, 7103, 310, 1175, 387, 10941, 875, 1666, 25379, 2299, 352, 812, 320, 5520, 923, 772, 50276, 19, 3236, 414, 352, 3133, 4722, 285, 4460, 281, 513, 19627, 1320, 970, 253, 260, 333, 2205, 4979, 275, 269, 15421, 5028, 347, 436, 3133, 281, 18210, 7364, 273, 643, 3082, 326, 452, 19408, 2820, 281, 19627, 907, 253, 4315, 3969, 281, 253, 27311, 253, 2022, 2523, 310, 326, 824, 1232, 3798, 6548, 247, 4315, 326, 1057, 417, 2723, 281, 271, 4588, 27311, 285, 436, 2929, 3133, 281, 3693, 326, 2523, 17965, 50276, 20, 8453, 432, 5661, 7103, 352, 3133, 326, 253, 4081, 1332, 310, 247, 2266, 7431, 2190, 643, 18075, 281, 19627, 27311, 267, 8090, 2299, 352, 310, 671, 625, 8214, 846, 250, 2858, 22559, 253, 4477, 452, 2879, 3081, 4679, 285, 921, 326, 275, 3946, 616, 1332, 476, 320, 347, 3809, 390, 7938, 685, 643, 1666, 25379, 275, 2176, 15216, 50276, 21, 8453, 352, 3133, 326, 253, 2317, 273, 19627, 21257, 4764, 1025, 275, 253, 1039, 4081, 275, 436, 2929, 476, 320, 625, 43541, 685, 2045, 789, 270, 21592, 390, 391, 7381, 3738, 436, 812, 320, 31637, 2007, 407, 253, 4477, 50275, 5040, 337, 19843, 846, 4361, 253, 2929, 2709, 2069, 352, 310, 417, 2590, 281, 479, 604, 436, 310, 247, 47641, 4154, 390, 604, 359, 476, 5276, 326, 253, 3453, 273, 253, 9267, 18859, 260, 333, 2205, 3828, 310, 6296, 19627, 326, 310, 476, 359, 921, 247, 10012, 751, 1339, 269, 89, 320, 253, 3453, 273, 253, 260, 333, 2205, 3828, 5933, 337, 840, 269, 89, 1269, 50276, 18, 323, 512, 1269, 310, 436, 2032, 1057, 352, 1056, 3282, 352, 4453, 751, 4679, 275, 4677, 337, 403, 2820, 281, 17813, 436, 45190, 533, 891, 4282, 604, 436, 7473, 906, 812, 320, 2797, 846, 250, 2858, 22559, 253, 4477, 1056, 436, 625, 2590, 275, 7118, 495, 285, 577, 50276, 19, 19843, 891, 1928, 326, 627, 403, 1142, 2426, 13044, 1475, 1293, 247, 2590, 5426, 285, 352, 2789, 253, 7125, 21643, 275, 1798, 891, 717, 5015, 670, 253, 2644, 273, 2593, 891, 21184, 2007, 50276, 7053, 8413, 8819, 41961, 310, 2931, 323, 6278, 12624, 285, 840, 253, 1307, 8413, 8819, 41961, 2721, 310, 323, 253, 9261, 4195, 66, 255, 594, 752, 1057, 8413, 8819, 41961, 2721, 1599, 604, 352, 2097, 247, 1159, 824, 326, 697, 258, 484, 307, 310, 8413, 8819, 25562, 840, 352, 651, 320, 14916, 281, 3453, 667, 3638, 8413, 8819, 25562, 4315, 253, 1774, 
2181, 670, 4195, 66, 255, 310, 326, 352, 310, 275, 958, 253, 19627, 12378, 273, 247, 4830, 253, 2317, 273, 8413, 8819, 25562, 4315, 40090, 247, 2803, 273, 1249, 50276, 7461, 752, 1057, 352, 1599, 281, 8413, 8819, 1105, 3899, 363, 2721, 247, 27311, 1060, 253, 4477, 897, 253, 14951, 2410, 22358, 534, 310, 417, 2931, 9825, 533, 891, 5476, 326, 352, 2097, 253, 4872, 5572, 3969, 281, 253, 27311, 342, 10295, 259, 533, 310, 436, 247, 2014, 10295, 4315, 891, 5467, 984, 5010, 752, 310, 22923, 275, 2505, 13118, 17118, 3877, 326, 275, 253, 2045, 12494, 259, 310, 2931, 347, 247, 13148, 273, 2169, 5958, 21, 594, 697, 811, 3014, 310, 417, 2931, 50276, 249, 1386, 342, 253, 2045, 253, 26724, 1039, 281, 8413, 8819, 41961, 2721, 2410, 88, 651, 320, 2410, 88, 50276, 13118, 17118, 752, 310, 253, 5886, 342, 253, 2048, 2410, 88, 50276, 13118, 17118, 275, 6010, 891, 1119, 436, 12494, 6685, 21643, 846, 250, 2858, 22559, 253, 4477, 564, 689, 253, 14308, 285, 1024, 1056, 625, 2590, 7234, 275, 2593, 577, 534, 2789, 253, 4154, 625, 44003, 253, 1655, 2715, 310, 625, 25441, 2980, 281, 253, 9414, 50276, 20, 19843, 891, 717, 13477, 407, 253, 767, 4081, 3082, 1531, 7381, 285, 260, 333, 2205, 534, 891, 1158, 403, 417, 6283, 16318, 275, 253, 10336, 15711, 275, 3239, 721, 752, 513, 368, 1599, 326, 323, 776, 260, 333, 2205, 3828, 359, 897, 253, 260, 333, 2205, 4979, 323, 15274, 752, 310, 253, 5795, 310, 2649, 970, 253, 45830, 253, 2644, 1127, 670, 1531, 7381, 285, 260, 333, 2205, 3828, 752, 310, 253, 3064, 875, 253, 260, 333, 2205, 3828, 285, 1531, 7381, 327, 247, 806, 1239, 891, 1869, 597, 497, 253, 1072, 4453, 751, 1531, 7381, 310, 391, 7381, 50276, 68, 333, 2205, 4979, 533, 352, 651, 320, 1175, 281, 19148, 253, 3910, 187, 187, 4118, 18435, 27, 635, 1175, 2929, 352, 29328, 247, 4460, 4764, 1320, 273, 19627, 2410, 17009, 326, 4648, 253, 260, 333, 2205, 4979, 275, 253, 269, 15421, 5028, 253, 2929, 25339, 2067, 7794, 273, 253, 4081, 4764, 1320, 1690, 7364, 285, 15180, 15711, 285, 921, 12866, 352, 275, 253, 1774, 2898, 273, 48960, 31640, 17170, 1175, 1543, 253, 10123, 403, 512, 1077, 2762, 594, 516, 5211, 281, 5583, 14924, 50276, 12563, 247, 1943, 11557, 483, 281, 253, 30628, 285, 281, 253, 4477, 323, 1146, 16383, 1309, 253, 5955, 2180, 253, 30628, 9583, 342, 253, 2929, 281, 247, 1270, 6864, 285, 253, 4477, 5520, 253, 2929, 15455, 347, 247, 2380, 973, 2218, 281, 512, 273, 368 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
The paper proposes a new initialization scheme for the k-median problem on graph input or general metric spaces, using a metric-embedding tree structure. It gives an algorithm that finds good initial centers using an HST and achieves an approximation factor of O(log min(k, d)) if the data is in Euclidean space, where d is the number of dimensions. The paper then studies clustering with a differential privacy guarantee and shows that the initialization method can be adapted to give slightly stronger multiplicative and additive errors. The work complements these theoretical findings with experiments showing that the proposed initialization improves the performance of k-median initialization.

Overall, the paper makes some good contributions and adds to the community's understanding of the k-clustering problem, especially with privacy constraints. The paper proposes new algorithm designs and proves updated bounds for k-median with and without privacy constraints. The experiment design also seems comprehensive and persuasive to me. The writing in this paper is mostly smooth but could still be improved and made clearer in some ways. The paper does well in presenting the algorithms, and the framework and ideas there are interesting. One can argue that the improvement in approximation ratio for classical k-median clustering is marginal, but as the authors note, the design is new and it serves well for the setting of differential privacy constraints. I do think the authors could be clearer when talking about previous work on private clustering, the approaches used and the differences, and the setting of Euclidean space vs. general graph input; right now the problem setting and the results seem a bit confusing to me.

Minor comments:
1. Does the number 2 really matter in the 2-HST that you are using, or can we replace it with something like 1+epsilon?
2. In the approximation ratio there is min(k, d); does that mean that we can only obtain this result with points in Euclidean space?
3. The notations nv and Nv share the same meaning, right?
3. When defining score(v) = N(v) * 2^{h(v)}, is this the first time the notation h(v) appears? I couldn't seem to find a definition of it.
4. In my opinion the paper sometimes uses new terminology and assumes the reader knows what it is. I think it is better to introduce terminologies and include a short description of them, just to make sure the reader is on the same page; for example, the major comparison k-means/median is never fully explained.
5. The paper studies k-median clustering; I wonder if we know anything about what happens when we switch to k-means. Do the conclusions still hold?

Mostly, I consider the contribution made in this paper to be meaningful to the clustering community, but it could be improved, at least in writing. It has a valid theoretical framework, but as a review from the broader clustering community I find it hard to judge how significant these findings are.

docsep

This paper introduces a new initialization scheme for the k-medians clustering problem in the general metric space setting. This is based on the construction of metric embeddings via 2-HSTs (hierarchically well-separated trees). The authors also extend this to the differential privacy (DP) setting. They prove approximation guarantees in both the non-DP and DP settings, improving upon the literature. Finally, they empirically validate the algorithms against a number of baselines, with both real-world and synthetic datasets, for multiple metrics.

Strengths: This paper gives improved k-medians bounds in the general metric setting (non-DP) that improve over the literature in the d < k regime. It also gives the best known theoretical guarantee in the DP setting and is worse than the lower bound by only a small additive factor of k log log n. The paper empirically compares the algorithm to a number of standard baselines, showing favorable results for multiple datasets and metrics, especially for less separable data and when the input data is an unbalanced subset of the universe.

Weaknesses: a few comments on improving the paper. The paper does not discuss runtimes of the algorithms; a discussion of runtimes, as well as a comparison of runtimes in experiments, would be useful. Regarding the results for k-means: moving the k-means results to the main paper, as well as a brief discussion and comparison to k-medians, would be useful.

We give a number of suggestions to improve clarity:
- "hst tree" -> "HST"
- "symmetric difference one" -> "symmetric difference of size one"
- in Algorithm 1: costf x y -> costf x y
- "we will count levels from large to small" -> "we will count levels in descending order down the tree"
- Section 3.2 intro: "suppose T is an L = log Delta level 2-HST" -> "let L = log Delta and suppose T is an L-level 2-HST"
- Algorithm 2: mention Algorithm 6 in "build a level-L 2-HST tree T based on input U"
- state Theorem 3.4 and Theorem 3.5 before the lemmas
- state that NDP stands for non differentially private
- Theorem 4.2: constants (10) can be absorbed into big-O notation

This paper improves upon the literature for the k-medians clustering problem in the general metric setting as well as in the differentially private case; the approximation guarantees improve the best known results in this case. The experiments demonstrate the validity of the theoretical contributions as well.

docsep

This paper considers the problem of finding good initial centers for the fundamental problem of k-median clustering, using a randomized embedding of the original metric into a tree metric. After setting the initial centers, a standard local search algorithm is applied to produce an improved solution. This is explored in both the standard context of k-median clustering and the relevant context of differentially private clustering; in the latter setting, the goal is to minimize the amount of additive error introduced by the algorithm subject to being epsilon-differentially private. An extension to k-means is given in the appendix.

In the standard setting of k-median clustering, the main theoretical result is an initialization algorithm which is an O(log min(Delta, k))-approximation to the optimal k-median clustering. This is an improvement over k-median, which gives O(log k), when Delta is small (e.g., for Delta = O(d) and d small). Using this as a seed for a local search method results in an O(1)-approximation overall. At a high level, their algorithm first constructs an embedding of the original metric into a hierarchically well-separated tree (HST); from there, the initialization can be seen as finding an O(1)-approximate solution on the HST efficiently. The overall guarantee follows from standard results about HSTs.

In the differentially private setting, the main result is a similar guarantee on the quality of the initial solution, and also a bound on the quality of the final solution when using a known private local search algorithm. The quality of the final solution has O(1) multiplicative error and O(epsilon^-1 k^2 Delta log n log log n) additive error. This is an improvement over the additive error of O(epsilon^-1 k^2 Delta log^2 n) due to Gupta et al. (2010). The number of local search iterations is also improved, from O(k log n) to O(k log log n). The main idea for the initialization is similar to the standard setting, but here they use the structure of the HST to ensure the initial solution is private, by injecting a different amount of noise at each level of the tree.

An empirical study is done on a class of synthetic graphs as well as the MNIST dataset. For the synthetic graphs the metric space is given by the weighted shortest-path distance in each graph, while for the MNIST dataset the metric is given by either ell_1 or ell_2. The authors compare both the initial costs and the final costs after running a local search method, for several initialization methods, in both the standard and differentially private settings. The main observation is that the proposed initialization methods tend to have better initial cost, and the proposed differentially private method often outperforms the other methods in both initial cost and final cost.

The main strengths of this paper lie in the differentially private version of their algorithm; here we get both theoretical and empirical improvements over prior work. I also appreciated the simplicity of the algorithms and the clarity of the presentation.

In terms of weakness: first, there is some misleading language in how the result for the standard setting is presented. On page 2 the paper claims that the proposed method provides an O(log min(k, d))-approximation, but later on page 6 this is clarified as being an O(log min(Delta, k))-approximation. This achieves the former bound when the input data is bounded so that Delta = O(d), but this caveat is not discussed in the beginning of the paper. I recommend the authors move this discussion up to the statement of their contributions, to avoid misleading readers. Next, the main approach of the paper is to embed the input metric into a tree metric via an HST, then efficiently compute an O(1)-approximation on this HST. Metric embeddings, especially tree embeddings, are a standard technique in approximation algorithms, and this should be made clear in the discussion. Additionally, it should be noted that there are polynomial-time algorithms for computing an exact k-median solution on a tree metric (e.g., see [1, 2]). The proposed algorithm is a worthwhile contribution due to its simplicity, but these prior results should be discussed, and it would be interesting to use one of these exact methods as a baseline in the experiments. In the experiments, the authors run a fixed 20 iterations of local search after finding the initial centers, then report the k-median costs. It might be interesting to also compare the runtime/iteration cost of reaching a locally optimal solution for the proposed methods and baselines, as well as the final cost of the locally optimal solutions found by each. This sort of experiment seemed to be motivated by the discussion of the improved iteration bound for the differentially private method, but is missing as of now. I lean more towards rejection but would be inclined to increase my score if the above comments are addressed.

References:
[1] Shah, Rahul. Faster algorithms for the k-median problem on trees with smaller heights. Technical report, 2003.
[2] Tamir, Arie. An O(pn^2) algorithm for the p-median and related problems on tree graphs. Operations Research Letters, 1996.

Edit after reading the authors' responses: one of the main benefits of the proposed method for clustering on a tree is that it can be adapted to the differentially private setting, which it is unclear how to do for other dynamic-programming-based methods. I have raised my score to weak accept.

This paper considers initialization methods for k-median clustering in both the standard setting and the differentially private setting. The paper gives theoretical bounds for their methods in both settings and backs this up with an empirical study. Given the misleading presentation of some of the results, a lack of discussion/comparison to prior work on k-median in tree metrics, and a lack of running time/iteration count comparison in the experiments, I am not okay with accepting this paper unless these points are addressed.

docsep

The paper suggests an algorithm for the metric k-median problem using ideas from metric embedding theory. The suggested use of the algorithm is as an initialization routine for the local-search-based algorithm for k-median. The differentially private version of the algorithm is also given, along with bounds on the k-median approximation factor. Experiments are conducted over datasets such as MNIST, and results are compared against the k-means algorithm, a popular initialisation algorithm.

The paper uses familiar techniques from the metric embedding literature to suggest an algorithm for the k-median problem, and the contributions do not seem to be very strong. The metric-embedding-based algorithm for k-median is shown to give an approximation guarantee of log min(k, d), which is better than k-means, which gives O(log k). However, there are other algorithms that give much better approximation guarantees, and it is not clear why the comparison is done with k-means here; this is something that the paper does not elaborate. Perhaps the algorithm is being suggested as an initialisation routine and hence the comparison is done with k-means, but then that cannot be the only reason, since the other algorithms with better approximation guarantees can also be suggested as initialisation routines. The discussion seems to be lacking on this aspect, in my opinion. The improvement with respect to the differentially private setting seems to be minor over the previous work. In summary, the paper neither introduces new techniques nor obtains significant improvement over past results. My suggestion for improving the paper would be to add a discussion on why the suggested algorithm should be the right initialisation algorithm.

### Summary:
This paper considers initialization methods for the k-means algorithm. There is a lot of prior work in this area. The reviewers were mildly positive on the paper. There were several concerns about how the results were presented, as well as about the comparison to prior work. Importantly, no reviewer felt that there was a lot of novelty in the paper over the line of work on k-means initialization.
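For reference, the bounds quoted in the reviews above can be written out explicitly. This is only a sketch of how those statements read in standard notation; OPT_k (the optimal k-median cost), Delta (the diameter/aspect-ratio bound), and epsilon (the privacy budget) are assumed names, not taken from the paper itself.

```latex
% Non-private initialization, as quoted in the reviews above (assumed notation):
\mathrm{cost}(C_{\mathrm{init}}) \;\le\; O\bigl(\log \min(\Delta, k)\bigr)\cdot \mathrm{OPT}_k
% Differentially private solution after private local search:
\mathrm{cost}(C_{\mathrm{DP}}) \;\le\; O(1)\cdot \mathrm{OPT}_k
  \;+\; O\!\left(\frac{k^{2}\,\Delta\,\log n\,\log\log n}{\varepsilon}\right)
% versus the earlier additive error of O(k^2 \Delta \log^2 n / \varepsilon) (Gupta et al., 2010)
```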
[ input_ids: tokenized form of the example text above (numeric token sequence omitted) ]
[ attention_mask: a run of 1s, one per token (omitted) ]
[ labels: identical to the input_ids token sequence (omitted) ]
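The k-median reviews above mention a subtree scoring rule of the form score(v) = N(v) * 2^{h(v)} on a 2-HST. The snippet below is a toy illustration of how such a score could drive a greedy choice of initial centers; the Node structure, the representative field, and the greedy loop are assumptions made purely for illustration and are not the paper's actual algorithm.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Node:
    """One node of a toy 2-HST: its height, how many data points fall in its subtree,
    an arbitrary representative point, and its children."""
    height: int
    n_points: int
    representative: str
    children: List["Node"] = field(default_factory=list)


def score(v: Node) -> int:
    # Scoring rule quoted in the review: score(v) = N(v) * 2^{h(v)}
    return v.n_points * (2 ** v.height)


def greedy_initial_centers(root: Node, k: int) -> List[str]:
    """Repeatedly open the unopened subtree with the largest score and take its
    representative as a center (an illustrative greedy reading, not the paper's method)."""
    frontier = [root]
    centers: List[str] = []
    while frontier and len(centers) < k:
        best = max(frontier, key=score)
        frontier.remove(best)
        centers.append(best.representative)
        frontier.extend(best.children)  # children become candidates for later centers
    return centers


if __name__ == "__main__":
    leaves = [Node(0, 3, "a"), Node(0, 5, "b"), Node(0, 2, "c")]
    left = Node(1, 8, "a", leaves[:2])
    right = Node(1, 2, "c", leaves[2:])
    root = Node(2, 10, "b", [left, right])
    print(greedy_initial_centers(root, k=2))  # picks the two highest-scoring subtrees
```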
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This contribution consists of an OpenAI gym style wrapper around the SUMO traffic simulation package, a selection of traffic scenarios that can be run using that package, as well as a set of RL algorithms that can use this interface. The authors ran all of the provided algorithms against all provided environments, report performance figures for those runs, and make the code for doing so available.

This system seems potentially very useful for making traffic signal control easier to experiment with in an RL setting, since the OpenAI gym API is ubiquitous and this system handles featurization of the action and observation spaces in a convenient way. It is also helpful that there are selected traffic scenarios covering a range of relevant, realistic conditions that any algorithm attempting to solve this problem should handle. The paper also provides a reasonable introduction to the topic as well as a description of how traffic signal control is modeled as an MDP.

The primary weakness of this paper is that its main contribution, the benchmark, is provided under a no-derivatives license. This makes it illegal for other researchers to build on this work by modifying the provided code, which drastically decreases the value of the contribution. I recognize that the individual traffic scenarios are licensed under restrictive licenses for understandable reasons, but there is no clear reason why this restriction is present for the new contributions made in this paper.

The packaging and documentation would also both benefit from additional work. The existing documentation consists of this paper as well as a small readme describing some of the installation requirements and how to run the code, with the expectation that the user works out of the code directory provided. Working out of another author's code directory is not convenient compared to being able to install the library through Python's standard packaging system; the vast majority of frequently used RL benchmarks provide the ability to install them, and it is not difficult to add support for doing so. It would also be preferable if the different algorithms presented used a single neural network library instead of requiring multiple ones, with unclear version requirements, to be installed; in particular, dependencies on versions of TensorFlow that lack officially supported builds are highly inconvenient. The paper has a few minor formatting issues, mostly unresolved references, and spends too long criticizing CityFlow, but this is not consequential.

docsep

This paper introduces a benchmark for RL-based control of traffic lights. The primary advantages touted over existing work include use of the SUMO simulator, traffic scenarios based on real-world road layouts and traffic levels, a standard gym interface, and a set of reference algorithms for the proposed environments. Experiments show that independent PPO/DQN tend to do quite well at the end of training but take far longer to converge than specialised algorithms like MPLight.

S1: The paper makes a good case for the importance of studying the traffic control problem and having accurate simulators. Arguments in favour of this new simulator are clearly laid out (although note the first weakness below).
S2: On execution, the writing was generally clear and the choice of experiments seemed reasonable.

W1: It is unclear to me how much value this new benchmark provides for the community relative to the various existing benchmarks. The strongest argument seems to be in Section 2.3.1, which claims that existing benchmarks are less realistic because they either use simplified or arbitrarily overcomplicated versions of real-world traffic grids. I do not know how much this matters for evaluation, though; it may be that the simplifications in question do not affect relative rankings of different methods at all, and the experiments in this paper do not try to evaluate how rankings change when moving from, e.g., the Jinming Feng simulator to RESCO.
W2: There are a few important experimental details missing from the paper.

docsep

This paper proposes: (1) benchmarking signal control tasks based on well-established traffic scenarios; (2) implementations of various RL algorithms on these signal control problems; (3) comparison and analysis of these RL algorithms under varying sensing assumptions.

The benchmarking proposed in this paper could be very helpful for future study on using RL for traffic signal control. A standardized benchmark for RL on traffic signals could serve as the foundation to help better future algorithm design for such problems, which could bring lots of benefits for real-world traffic control. The paper is well written and the benchmarking tasks are carefully chosen to match real-world scenarios. The experiments are well conducted: a comprehensive set of baseline controllers and RL controllers are evaluated, the implementations are validated by checking against previous work, and detailed analysis is given for comparing the performance of the benchmarked algorithms.

It would be interesting to see how the RL algorithms would work under weaker sensing abilities. Can the authors give any discussion on how much of the sensing information mentioned in the paper can be accurately measured in real-world traffic control scenarios, i.e., how realistic are the sensing assumptions?

docsep

Proposes benchmarks to study the problem of congestion control using realistic traffic situations. Presents baselines and thorough comparative evaluations. Current progress in congestion control algorithms is evaluated using different settings, avoiding fair comparisons between them; this is a well motivated paper that positions itself as an effort to consolidate the diverging field. It presents a set of baselines and evaluations to outline a fair comparison between various state-of-the-art algorithms, and baselines are validated before they are used for comparisons.

Line 7232: be precise and revise the premise and claim about model-based methods and deep Q-learning algorithms. Provide more explanation w.r.t. the yellow light: (a) is the duration of the yellow light subsumed inside a phase? (b) how is wait time etc. affected by the yellow light? (c) is a yellow light mandatory after every phase transition? (d) how are situations such as free right turn on red handled? Line 166: missing reference and typo. While I'm convinced, please justify why allowing algorithms the choice of state and rewards is a good choice for fair comparison, given they are being evaluated on two different MDPs. Sec 4.1: the pronounced trend for FMA2C and IDQN isn't well justified; this is too large of a gap to gloss over.

### Summary:
All the reviewers appreciate the value of the proposed benchmarks, and the remaining concerns seem addressable. I recommend accepting the paper while asking the authors to incorporate the review feedback into the camera-ready paper.
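Since the benchmark in this last example is described as an OpenAI gym style wrapper around SUMO, the interaction pattern it exposes is presumably the standard gym loop. The sketch below shows only that generic loop; the environment id in the commented-out line is hypothetical, and the four-tuple step signature assumes the classic (pre-0.26) gym API.

```python
import gym


def run_episode(env: gym.Env) -> float:
    """Roll out one episode with random actions through the classic gym API;
    in practice the random action would be replaced by an RL policy (e.g. DQN or PPO)."""
    obs = env.reset()
    done, total_reward = False, 0.0
    while not done:
        action = env.action_space.sample()          # random policy as a placeholder
        obs, reward, done, info = env.step(action)  # classic 4-tuple step signature
        total_reward += reward
    return total_reward


# Hypothetical usage; the actual environment id of the SUMO benchmark is not given in the review.
# print(run_episode(gym.make("sumo-traffic-signal-v0")))
```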
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

the authors show that deep convolutional networks restructure the eigenspaces of the inducing kernels, which empowers them to learn a dramatically broader class of functions covering a wide range of space-frequency combinations.

strengths: this paper evaluates the success of deep neural networks by running extensive experiments that show the tradeoff between space and frequency.

weaknesses: the meaning of the word frequency is unclear in the context of this paper. the main idea behind the theorem (eigenspace restructuring), which replaces one eigenvector/eigenvalue computation with a set of part-based eigenvalues/eigenvectors, has limited novelty; the novelty is in its use in neural network architecture analysis. the authors might want to consider evaluating the performance of a system based on the amount of data employed and based on how data was acquired (observational vs experimental studies); systems that employ data from observational studies are known to be susceptible to selection bias and spurious correlations, as opposed to data from experimental studies that are employed in objective causal inference.

missing reference: as the authors might recall, vasilescu et al. in their icpr 2020 paper have advocated replacing the svd computation with a set of part-based svds, for which they provided a closed-form mathematical derivation. based on this mathematical derivation they developed the incremental hierarchical m-mode block svd, which was demonstrated experimentally. vasilescu, m. alex o., kim, eric, and zeng, xiao s., "causalx: causal explanations and block multilinear factor analysis", in 2020 25th international conference on pattern recognition (icpr 2020), milan, italy, jan. 2021, pp. 10736-10743.

the authors evaluate deep convolutional networks through the lens of hierarchical locality. the paper evaluates the performance of deep neural networks by performing extensive experiments that show the tradeoff between space versus frequency. the eigenspace restructuring theorem has a well-chosen name; however, eigenspace restructuring is a well-known concept and the novelty of the theorem is limited. clarity of the writing and its overall organization needs improvement. the word frequency needs to be better defined. it is not clear if there are any novel insights.

update: my original score was too generous, and it was in anticipation that any mathematical inconsistencies or ambiguities would be addressed.

the paper shows how the topology of a convolutional neural network restructures the eigenspace of the neural kernel.

strength: the theorem statements are simple and clear.

weakness: it is hard to decipher what is happening with just the theorem statements alone.

clarity: the paper is clear till section 3; then most of section 4 is confusing. the theorem statements are simple and clear; however, it is hard to decipher what is happening with just the theorem statements alone. the following are not clear and need to be discussed explicitly:

- how is the cardinality of $\Lambda(\mathcal{G})$ related to the architecture?
- does condition 1 on assumption g place a restriction?
- is the weight $w_{uv}$ associated to the edge $uv \in \varepsilon_d$ a constant, since it is said that $w_{uv} \equiv \alpha_u$? it looks like $w_{uv} = \alpha_u$ is used in definition 1; however, it is confusing because $w_{uv}$ is also being used in section 3.1, equations 14, 17-20. please clarify, i.e. are the $w_{uv}$ in section 3.1 and section 4 the same? if not, is it possible to use a different notation?
- is $\sum_{v \in \mathcal{N}_0} d_v = d$?
- what is the intuition behind spatial distance?
- what are the activations that satisfy assumption $\phi$?
- $\mathbf{t}_{\mathbf{r}}$ is defined, but it is not clear where it is used.
- is $N(\mathbf{r})$ equal to the set of input nodes for all $\mathbf{r}$ such that $\mathbf{r} > 0$? or, put differently, what happens if one of the $r_v$ is equal to zero and one of the input nodes gets left out?
- is there an example of $N(\mathbf{r})$ without a common ancestor $u$ such that $\phi_u$ is admissible?
- is there an example of an $\mathbf{r} \notin \mathcal{A}(\mathcal{G}, d)$, i.e. an $\mathbf{r}$ that is not learnable?
- in section 5 (scnns and srhf interactions), it is said that since the activation function of the last layer is the identity, we have $\mathcal{A}(\mathcal{G}, d) = \{\mathbf{r} \in \mathbb{N}^{\mathcal{N}_0 d} : N(\mathbf{r}) = 0 \text{ or } 1\}$; however, it seems that the same argument applies to mlps as well. page 4, last paragraph, mentions that the activations of the input/output nodes be the identity function, yet in equation 29 $\mathcal{A}(\mathcal{G}, d) = \mathbb{N}^d$; it is not clear why there is a difference.
- how can $N(\mathbf{r})$ be 1 or 0 given that there are $w$ input nodes? which node gets chosen for $N(\mathbf{r}) = 1$?
- how is $\mathbf{r}$ defined?
- what is the meaning of short/long range interaction?
- it is not clear what the dotted lines in figure 2 mean, especially what is the point of highlighting the leftmost branches alone; same goes with figure 4 (the figure zoo) in the appendix.

overall: 1) it is not clear what is meant by space and the space-frequency tradeoff. 2) given that there are too many constants ($\alpha_u$, $\alpha_v$, $\alpha_w$, $\alpha_p$), it might be great to give some template networks to illustrate; while the paper indeed talks about mlp, scnn, dcnn, etc., the discussion about how the specific constants were arrived at in equations 29-31 is very rushed and is the major cause of confusion.

the current score is mainly due to the issues related to clarity.

the paper categorizes architectural biases from a frequency point of view through the lens of ntks and nngps. the two main quantities are the frequency index (fi) and the spatial index (si), which are calculated from the dags. the authors claim that these two quantities capture what type of functions, in terms of frequency and space, can be learned by specific architectures, i.e. mlps learn lrlf functions, scnns learn srhf functions, and so on. the main theoretical contribution is theorem 1: the authors give an architecture-dependent spectral decomposition formula for the ntk and nngp kernels using dags.

strengths: the paper sounds very interesting; as far as i know, it proposes a unique perspective to capture architectural biases finely.

weaknesses: the paper is very hard to follow, partially because it is nonstandard. some parts lack rigor. figures need a much better explanation. in my opinion, the biggest issue is the interpretation of the main results (section 5); i cannot follow how the authors created the dag of the scnn network, for example, and from there arrived at equation 30. experiments are only made for a particular and somehow peculiar type of synthetic data. i would also appreciate some explanation of what convp22 and so on correspond to in terms of standard convnet architectures.

detailed feedback:
- in figure 1 it was not clear to me what the black line represents.
- can you study what kind of architectural changes make the family of learnable functions bigger, for example making the network wider or deeper in the case of mlps?
- "the amount of time needed to learn the jth eigenfunction is about $1/k_j$" (page 3): can the authors elaborate on this point?
- figure 2 is not comprehensible as it stands; why not include a nice explanatory caption?
- page 5, at the beginning of the main results: "the goal is non-asymptotic characterization as the input dimension $d \to \infty$"; here i'm confused about what "non-asymptotic" refers to.
- why is the spatial index defined twice, both in def. 1 and def. 2?

minor feedback / typos:
- although i liked the color code in general, i found the color code for finer interpolations confusing; in figure 1, the caption and bullet point 3 color codes do not match.
- at the end of page 2, where the mse is defined, it seems that the index for the summation should be $(x, y)$ in $(X, Y)$.
- page 2, where the gradient dynamics is described: a "d" is missing.
- it is not clear what "concrete input spaces" refers to on page 3.
- page 6, "it describes a connection": an "s" is missing.

although the paper looks very interesting, i found it very hard to follow. i think there is substantial room for making the paper more accessible (see the detailed comments above), but also my lack of familiarity with the tools used in the paper (i.e. graph computations) may have made it difficult to read for me. i could not follow how the authors arrived at the lrlf/srhf and similar interpretations; therefore i cannot recommend acceptance.

this paper provides new insights into understanding the mechanism behind neural networks, thanks to a space-frequency analysis through the so-called eigenspace restructuring.

the paper provides solid foundations to understand the mechanism behind neural networks, including deep neural networks. this is done thanks to recent theoretical advances on neural network gaussian processes and neural tangent kernels. this work seems very interesting. however, this work only explores the architecture of neural networks to understand their behaviors. while the conducted analysis is carried out on the model of the neural network, we find that it is missing some in-depth analysis of the other two basic ingredients of machine learning methods, namely the data and the optimization algorithm. it is worth noting that different optimization algorithms are used in the paper, namely sgd with momentum (used for finite-width networks) and kernel regression (used for infinite-width networks), but nothing is said about the influence of the algorithms. this is also the case for the missing analysis of the influence of the training dataset. there are some spelling and grammatical errors in the paper, such as "an demonstration", "it describe a", "our theorem justify", "spherial harmonics".

rebuttal: the authors' reply is off the mark, as they did not tackle our concerns nor address comments from the other reviews. after reading all the reviews and the authors' replies, i have downgraded my score.

the paper provides solid foundations to understand the mechanism behind deep neural networks.

### Summary:
the paper shows that deep convolutional neural networks in the kernel regime restructure the eigenspaces of the inducing kernels, which leads to some insights regarding the range of space-frequency combinations learned by such networks. the reviewers identified a number of problems with the current submission: for instance, they found that the paper is hard to follow, it lacks clarity, and the theorem statements are hard to understand. the authors also use a somewhat nonstandard experimental setup. despite an extensive discussion with the authors, which cleared out a few minor problems, the bulk of the concerns of the reviewers were not successfully addressed. i am therefore not able to recommend acceptance. the authors need to improve the clarity of the paper, provide more discussion of the theorems in a resubmission, and potentially reconsider their experimental setup.
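for readers less familiar with the kernel language the reviews above lean on, the following is a minimal worked illustration. it is standard mercer-type background for rotation-invariant kernels on the sphere (of which the ntk and nngp kernels of fully-connected networks are examples), not the paper's own theorem, and the notation below is generic rather than the paper's:

% generic eigendecomposition of a rotation-invariant kernel on the sphere;
% Y_{k,m} are spherical harmonics of degree k (the "frequency"), lambda_k the eigenvalues
\[
K(x, x') \;=\; \sum_{k \ge 0} \lambda_k \sum_{m} Y_{k,m}(x)\, Y_{k,m}(x')
\]
% in kernel regression, the component of the target along an eigenfunction is learned at a
% speed governed by its eigenvalue, matching the reviewers' remark that the time to learn
% the jth eigenfunction scales like 1/k_j

in this picture, the profile of the eigenvalues over the different eigenspaces determines which function classes are learnable in practice; the paper's claimed contribution, as described in the reviews, is an architecture-dependent (dag-dependent) version of such a decomposition in which the admissible space-frequency combinations change with the network topology. the formula above is illustrative background only.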
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

this paper is focused on training networks to solve problems via extrapolation, e.g. solving large mazes by learning to solve small mazes. in order to achieve this, two modifications to recurrent convolutional networks are proposed: (a) adding a concatenating skip connection from the input to the recurrent layers that stabilizes extrapolation (fig. 2), and (b) using a modified training method and loss that encourages the network to learn computations independent of the current iteration (alg. 1). together these techniques lead to substantial improvements in results on the problems of computing prefix sums, solving 2d mazes, and solving chess puzzles.

the paper is well written and easy to read and understand. it builds directly on recent work by schwarzschild et al. (2021a, 2021b, 2021c) and is essentially about addressing the limitations of that work on the challenging tasks identified in that work, which involve learning from easy instances of a task and generalizing to hard instances. this is an interesting line of work that could lead to useful insights, and this paper's contributions certainly make significant progress on the challenges considered. in particular, the authors have already conducted various analyses to answer follow-up questions that a curious reader might have (and i had) about the working of the proposed techniques; the results of these experiments will be very useful and instructive to readers interested in these problems.

i would like to point out two weaknesses of this paper. 1) the first of the proposed two techniques, the recall skip connection, appears to be fixing what one may call a mistake made by schwarzschild et al. in recent work during model design. this is because if one looks at prior literature on models that are stable when unrolled longer than training time, one immediately finds this model design based on theoretical insights from control theory; see ciccone et al. [a] and the non-autonomous networks therein. this is not to say that the contribution of this paper isn't useful, but it should be put in the context of past work that has already established strong theoretical results directly related to the behavior of interest. i find it interesting that in this paper no conditions on weights were necessary in practice to achieve stability. 2) i'm concerned that the authors are playing fast and loose with the term "thinking" in this line of work. this is a very loaded metaphor, and in my opinion every paper that decides to use it must at least include a clear note that thinking is just a fancier term for a particular type of processing here, and these networks are not really thinking in any regular sense of the term; otherwise we can say that any classifier or detector is also thinking. i realize that this is not something unique to this paper, and some other papers have used such terms without such clarification, but this is only a review of this paper and i'd appreciate it if the authors think about this issue and try to do better than prior work.

[a] ciccone, m., gallieri, m., masci, j., osendorfer, c., and gomez, f. (2018). naisnet: stable deep networks from non-autonomous differential equations. arxiv preprint arxiv:1804.07209.

the paper proposes simple techniques that clearly address the limitations of recent work on solving tasks in the easy-to-hard benchmark dataset.

this paper proposes two extensions to the recently proposed recurrent thinking systems in order to enable them to better generalize from training on simple problems to testing on more complex problems. the proposed extensions involve 1) giving the model access to a cue indicating the to-be-solved problem at each time step (recall), and 2) a method intended to prevent the model from learning behaviors specific to particular iterations, so as to enable generalization to more complex problems via a larger number of iterations. when combined, these extensions enable generalization to significantly more complex problem instances across three separate task domains.

both the underlying approach and the proposed extensions are compelling, and the results are promising. it is good to see results across a range of task domains. the analyses on convergence and overthinking in the latter part of the paper are also very nice. i think there are a few issues that need to be sorted out, but am happy to raise my score if the authors can address these concerns.

the only baseline considered is a feedforward version of the primary model, i.e. one in which parameters are not shared across iterations. this seems like a good comparison, but other baselines should be considered as well: how important is the particular recurrent architecture employed here, in which the output is fed back into the model as input? how might an lstm with a recurrent hidden state, or a model with an external memory, perform on the generalization benchmarks that are considered in this paper?

the proposed progressive loss seems to be useful primarily for getting these kinds of models to solve the task in the shortest number of iterations possible. the reason seems to be that a model trained only on a fixed number of iterations has no reason to arrive at the correct answer any sooner than the point at which a loss will be computed. given that, i wonder whether the proposed method is the simplest or best way to accomplish this: would it work to simply train models on a randomly sampled number of iterations, or to penalize longer processing as is done in adaptive computation time? it would be good to compare the proposed approach to these alternatives.

in the appendix the authors state that "when we compute averages we only include models that trained beyond a threshold training accuracy". does this mean that the reported results are only for a subset of models that reached some training criterion? how many models failed to reach this criterion, and are the results qualitatively different when no such criterion is used?

models are evaluated by taking the iteration with the highest confidence rather than simply using the final iteration. do the models still perform just as well at generalization to more complex problems if the final iteration is used?

on the prefix sum and maze problems, the recall element seems to be useful primarily for preventing the overthinking phenomenon, and for those task domains this element of their proposal appears to be highly effective, as it prevents overthinking even without the progressive loss. however, on the chess problems the version of the model with recall but without the progressive loss still suffers from overthinking. what might explain this discrepancy?

minor issues and additional questions: how does confidence evolve over time? do the models generally become more confident with a greater number of iterations? if so, could confidence be used as a signal for autonomously selecting how many iterations to perform, rather than having to select this by hand? when reading the paper for the first time it is unclear where the results in figure 1 come from; i'm assuming these results employ both of the proposed extensions, but it would be helpful to specify that this is the case, i.e. that vanilla thinking systems would not be capable of producing these results. the legend for table 1 of the appendix says "perfrom" instead of "perform". the legend for table 4 in the appendix references the prefix sum task, but the table appears to contain results from the maze task. at the end of the main section with results on the maze task, the reader is directed to section a3 to see an example of a 201x201 maze, but this is actually in section a7.

the proposed approach is compelling and the results are promising, but there are a few issues that needed to be sorted out, including some additional baselines and clarification of the selection criteria for the reported results. update after discussion period: the authors included a number of supplementary results and informative additional control experiments. i still think that the results would be more convincing if compared to a broader range of competitive baselines, but i think these new results/experiments are a significant enough improvement to merit a score increase.

the paper proposes two modifications to recurrent neural networks that help improve generalization on three synthetic tasks. in particular, the authors implement a recall mechanism through residual connections to prevent the recurrent network from losing original input information after many iterations; in addition, they randomly apply truncated backpropagation through time to the computation graph to build an auxiliary loss, which diminishes overfitting to the number of recurrent steps. the trained network can solve significantly harder problems by performing more recurrent iterations, showing strong generalization ability.

pros: the paper gives solid extensions to the idea of deep thinking networks (schwarzschild et al. 2021). the writing is generally clear and easy to follow. the modifications are well motivated, addressing the weaknesses of the original model. the experiments demonstrate well the effectiveness of the modifications, outperforming the baselines by a significant margin. there are detailed ablation studies and behaviour analyses that give insights into the inner working of the model.

cons: the modifications are incremental, using common techniques like residual connections and truncated backpropagation through time; unsurprisingly, using these techniques will help improve the performance. the tasks are toyish; it would be more persuasive if the method could work for realistic data or some classical benchmarks. the main manuscript misses some details of the method (see questions below).

overall i like the idea of this paper, which proposes an interesting way to exhibit logical generalization by allowing recurrent networks to think more during inference. however, as mentioned above, compared to the original work the proposed modifications are marginal: it is straightforward that using residual connections is critical to training deep networks, to combat missing input information or gradient vanishing, etc. the progressive loss is more interesting, yet it is unclear from the writing that this is a novel contribution to the training of rnns, e.g. compared to other auxiliary training for rnns as in trinh et al. 2018: what is the difference and advantage of using the proposed method? regarding experimental results, only the chess puzzle data seems real, but here the result is not impressive, as there is only a 4% improvement in terms of peak accuracy. the performance of the proposed method on the other two synthetic datasets is promising. i am curious whether
the method only works for these specific tasks or is extendable to other problems sequential inputs such as copy associative recall graph reasoning graves et al 2016 or natural data text images questions and comments it is reasonable to use cnn architecture for imagelike inputs however for prefixsum the input is a sequence of 0 and 1 did you apply cnn here what is the loss function to make the paper selfcontained in sec 3 or 4 you should describe the output format and the loss used to train the model from appendix a it seems that the final output is a 2d bitmap the procedure of selecting the inference iteration for performance measurement should be mentioned in the main manuscript the analysis would be more informative if the authors could compare the results of other iterations with that of the peak one how much difference did they converge to the final output as suggested in fig 9 for prefixsum did you test with longer sequences can your method generalize to 1024bit why there is no performance visualization for maze 201x201 in the paper the feedforward model is a weak baseline the experiments could be stronger with advanced architectures such as transformer or selfattention model fig 8 more explanation on why the new loss helped more in the setting will be appreciated fig 9 the yaxis values go up to 1010 hows that possible how does the new loss help in this case overall i like the idea of this paper however compared to the original work the proposed modifications are marginally significant docsepthe paper proposes two modifications to recurrent neural networks that enable them to extrapolate to larger problems than seen during training the problems generalizations are 1 pathfinding in a maze larger mazes 2 binary prefix sums larger bit strings 3 evaluate the best chess move given a position harder positions the modifications are 1 add the initial problem features to every step of the recurrent computation coined recall in the paper 2 train on a combination of a regular loss with m recurrent iterations and with a loss with mnk iterations where the gradient is not tracked for the first n iterations the paper shows that the modified recurrent network learns an algorithm that converges to the correct result with more recurrent iterations thus overcoming the problem of overthinking divergence really as coined in earlier papers the paper further shows that the learned convergent algorithm extrapolates to largerharder problems for instance the networks are trained on 32bit strings and evaluated on 512bit strings and trained on 9x9 mazes and evaluated on up to 201x201 mazes the paper compares their method ablations wo recall and modified loss mlps and mlps with recall and show that only their proposed network learns a convergent solution that extrapolates to the largerharder problem instances strengths the paper clearly states the problem and convincingly shows a solution the paper is pretty well written weakness the paper is evaluated on a very new dataset which seems ideal for the kind of method suggested and the only real comparison is the work they directly build on you can always find a dataset for which your method is the best and this paper has a bit of that feeling except for the chess positions the examples are new and toylike why do we need these new benchmarks why not evaluate on some more standard benchmarks eg image classificationsegmentation simple qa babi for instance etc also the bit sums and maze path finding are trivially solved with standard handcoded algorithms why would you 
want to use a neural net on them if the benchmarks are not interesting problems in their own right they might still be nice benchmarks because they nicely exemplify some specific problematic environments but then it must be very clearly explained how this ties back into better solving realworld problems the chess positions task is the nicest example problem but then again how does this compare to something like alphazero or any other modern mcts approach whats the benefit computational generalization this must be explicitly explained the paper doesnt discuss and compare to the very relevant work recurrent relational networks rrn 1 rrns are recurrent graph neural networks and very very similar both neworks have the same outputrecurrentwithrecallembedx structure and the rrn measures a loss on every step of recurrent computation similar to the modified loss the rrn also learns a convergent algorithm and generalizes to harder tasks i would very much appreciate a comparison to a rrn network in all of the new benchmarks or conversely evaluating on some of the same benchmarks as in 1 i suspect the performance will be very similar for the two networks the loss weight alpha hyperparameter is kind of inelegant have the authors tried simply measuring a loss on every recurrent iteration like in 1 this would also encourage a convergent algorithm and would be simpler 1 palm r b paquet u winther o 2018 recurrent relational networks in 32nd conference on neural information processing systems neurips 2018 montral canada vol 31 pp 33682278 decent paper but should be evaluated on more common benchmarks less toy settings and should discuss and compare to 1 ### Summary:
this is an interesting work and i urge the authors to keep pushing this direction of research unfortunately i feel like the manuscript in its current format is not ready for acceptance the research direction is definitely underexplored which makes the evaluation of the work a bit tricky still i think that some of the points raised by the reviewers hold for eg the need of additional baselines to provide a bit of context for what is going on i understand that the authors view their work as an improvement of the previously proposed dt network however that is a recent architecture not sufficiently established not to require additional baseline for comparisons this combined with the novelty of the dataset makes it really hard to judge the work the writeup might also require a bit of attention in particular it seems a lot of important details of the work or clarifications regarding the method ended up in the appendix a lot of the smaller things reviewers pointed out the authors rightfully so acknowledged in the rebuttal and proposed to fix however i feel this might end up requiring a bit of reorganization of the manuscript rather than adding things at the end of the appendix i also highlight and agree with the word thinking being overloaded in this scenario ablation studies some done as part of the rebuttal might also be a key component to get this work over the finish line eg the discussion around the progressive loss i acknowledge that the authors did run some of those experiments though i feel a more in depth look at the results and interpretation of them eg not looking just at final performance but at the behaviour of the system and integrating them in the main manuscript could also provide considerable additional insight into the proposed architecture my main worry is that in its current format the paper might not end up having the impact it deserves and any of the changes above will greatly improve the quality and the attention the work will get in the community
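The progressive loss described in the reviews above combines a standard loss over m recurrent iterations with a second loss in which the first n iterations are run without gradient tracking before k further iterations are applied. Below is a minimal sketch of that training step, assuming a PyTorch-style recurrent network; all names here (net, criterion, m, n, k, alpha) and the net interface are assumptions for illustration, not the authors' code.

```python
# Hedged sketch of the progressive-loss training step described above.
# Assumes a recurrent net whose forward pass takes an iteration count and an
# optional detached starting state; this interface is illustrative only.
import torch

def progressive_loss_step(net, x, y, criterion, m, n, k, alpha=0.5):
    # Standard term: m recurrent iterations with full backpropagation.
    loss_full = criterion(net(x, iters=m), y)

    # Progressive term: run n iterations with gradients disabled, then continue
    # for k more iterations from that detached state, so the later iterations
    # cannot specialize to a particular iteration index.
    with torch.no_grad():
        state = net(x, iters=n, return_state=True)
    loss_prog = criterion(net(x, iters=k, state=state), y)

    # Weighted combination of the two terms.
    return (1 - alpha) * loss_full + alpha * loss_prog
```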
[ 5430, 1057, 7162, 23554, 689, 673, 513, 253, 3210, 3839, 2489, 625, 13224, 342, 247, 3687, 1180, 273, 25142, 604, 594, 812, 7162, 320, 908, 347, 247, 2625, 323, 1125, 11168, 4087, 17221, 849, 1142, 25142, 281, 1347, 2581, 685, 1907, 281, 3609, 436, 407, 1133, 50276, 9453, 4361, 253, 2929, 323, 253, 806, 673, 352, 310, 12744, 835, 253, 1543, 275, 4677, 337, 1705, 432, 516, 7384, 841, 1543, 2126, 1097, 273, 253, 4081, 18149, 533, 352, 651, 320, 9371, 281, 13199, 326, 436, 310, 253, 1083, 26332, 326, 26724, 4680, 2718, 651, 417, 320, 7032, 273, 9603, 841, 1543, 50276, 783, 13691, 323, 2829, 337, 273, 253, 30762, 2296, 591, 4064, 3185, 273, 1347, 50276, 783, 13691, 323, 2829, 577, 275, 253, 30762, 10414, 253, 17744, 2020, 4836, 533, 253, 2829, 4620, 281, 3831, 1543, 432, 253, 37045, 4836, 50276, 255, 253, 990, 273, 253, 2022, 2593, 342, 1543, 327, 253, 37045, 4836, 253, 9414, 310, 6828, 281, 2593, 247, 20, 281, 923, 271, 1650, 273, 247, 848, 89, 1252, 37045, 533, 436, 310, 2686, 275, 2593, 247, 24, 50274, 783, 4081, 2746, 310, 18511, 285, 253, 1543, 403, 12532, 533, 627, 403, 247, 1643, 3374, 326, 3058, 281, 320, 20045, 562, 1690, 690, 3081, 1666, 25379, 285, 37699, 273, 253, 5438, 6866, 323, 253, 2361, 1543, 50276, 11183, 846, 5955, 2180, 253, 4477, 2908, 247, 1180, 273, 24864, 1543, 285, 27096, 3081, 1453, 4679, 891, 1335, 1158, 326, 253, 1543, 651, 320, 625, 21414, 604, 2429, 281, 247, 16055, 2491, 273, 12085, 1666, 25379, 533, 891, 1158, 841, 747, 906, 11523, 468, 3825, 403, 247, 1534, 2217, 7756, 281, 15785, 247, 4868, 2572, 5474, 339, 431, 248, 2929, 29328, 767, 14586, 281, 18902, 11454, 6928, 326, 1361, 3157, 26647, 327, 1264, 13506, 8892, 275, 1798, 253, 4477, 3359, 247, 6983, 5122, 949, 12541, 10291, 281, 3657, 253, 18902, 2990, 432, 10305, 3236, 3280, 1491, 846, 1142, 25142, 275, 1635, 597, 12421, 4647, 28069, 896, 44263, 318, 949, 673, 281, 253, 13782, 4216, 281, 1973, 271, 24026, 2957, 534, 13064, 6419, 689, 31893, 281, 253, 1180, 273, 18902, 5018, 253, 10166, 2990, 476, 8415, 3012, 12150, 3237, 407, 9591, 625, 18902, 25142, 4645, 2266, 26647, 3745, 50276, 856, 84, 50276, 783, 2929, 4245, 4891, 18149, 281, 253, 2934, 273, 3676, 2181, 6928, 5807, 29815, 50040, 1162, 355, 43425, 253, 4028, 310, 3839, 2590, 285, 3477, 281, 956, 253, 14586, 403, 973, 24013, 8550, 50276, 12025, 272, 253, 32213, 273, 253, 3236, 1566, 50276, 783, 4679, 7568, 973, 253, 12510, 273, 253, 14586, 41731, 14692, 253, 1666, 25379, 407, 247, 1534, 8459, 627, 403, 7000, 28913, 2175, 285, 8770, 6260, 326, 1918, 16039, 715, 253, 6703, 2444, 273, 253, 1566, 50275, 5040, 50276, 783, 14586, 403, 32809, 970, 1846, 5609, 751, 12541, 4602, 285, 28069, 896, 44263, 318, 949, 673, 5061, 321, 28761, 970, 841, 5609, 588, 1361, 3157, 253, 3045, 50275, 783, 8892, 403, 20953, 763, 352, 651, 320, 625, 34593, 604, 253, 1332, 812, 789, 323, 15958, 941, 390, 690, 8946, 49602, 50275, 783, 2022, 7714, 38771, 690, 4278, 273, 253, 1332, 923, 3533, 2708, 50276, 1189, 455, 891, 751, 253, 2934, 273, 436, 2929, 534, 29328, 271, 4722, 1039, 281, 10738, 13760, 26647, 407, 6941, 18902, 6928, 281, 1158, 625, 1309, 17032, 2299, 347, 5393, 1840, 2429, 281, 253, 3236, 789, 253, 4081, 14586, 403, 16888, 352, 310, 15246, 326, 970, 12541, 10291, 310, 4619, 281, 3733, 3676, 6928, 281, 11757, 5816, 3280, 1491, 390, 11786, 29199, 3966, 50276, 783, 13439, 2957, 310, 625, 4722, 2568, 352, 310, 12744, 432, 253, 4028, 326, 436, 310, 247, 4460, 7680, 281, 253, 3733, 273, 391, 79, 2224, 24088, 2429, 281, 643, 24026, 3733, 323, 391, 9866, 347, 275, 492, 249, 
73, 1162, 355, 4765, 752, 310, 253, 3064, 285, 5750, 273, 970, 253, 4081, 1332, 50275, 1747, 13218, 5661, 1543, 760, 29992, 25351, 941, 3133, 1524, 533, 1060, 253, 906, 310, 417, 13943, 347, 627, 310, 760, 247, 577, 7756, 275, 2426, 273, 5241, 7200, 253, 3045, 273, 253, 4081, 1332, 327, 253, 643, 767, 13506, 941, 310, 12532, 50276, 74, 717, 14338, 1880, 253, 1332, 760, 2987, 323, 841, 2173, 8892, 390, 310, 9017, 494, 281, 643, 3237, 22453, 14800, 824, 347, 3491, 42162, 6983, 4216, 14720, 38854, 1162, 355, 4022, 390, 3626, 941, 2505, 3888, 50276, 34974, 285, 5701, 50275, 262, 310, 5272, 281, 897, 260, 9866, 10336, 323, 4440, 44549, 14800, 2299, 323, 17744, 2204, 253, 3280, 310, 247, 3425, 273, 470, 285, 337, 858, 368, 4647, 260, 9866, 1060, 50275, 5371, 310, 253, 2957, 1159, 281, 1056, 253, 2929, 1881, 41010, 275, 4706, 495, 390, 577, 368, 943, 6266, 253, 3453, 5981, 285, 253, 2957, 908, 281, 6194, 253, 1566, 432, 30762, 247, 352, 3133, 326, 253, 2457, 3453, 310, 247, 374, 69, 42615, 50276, 783, 5199, 273, 17221, 253, 17032, 19502, 323, 3045, 6814, 943, 320, 5393, 275, 253, 2022, 7714, 50275, 783, 1783, 651, 320, 625, 27096, 604, 253, 4477, 812, 7277, 253, 1543, 273, 643, 25142, 342, 326, 273, 253, 5241, 581, 849, 1199, 3064, 858, 597, 29623, 281, 253, 2457, 3453, 347, 5125, 275, 3036, 898, 50276, 1542, 17744, 2204, 858, 368, 1071, 342, 3356, 6430, 476, 634, 1332, 39970, 281, 27277, 2713, 2139, 627, 310, 642, 3045, 24426, 323, 37045, 848, 89, 1252, 50276, 249, 253, 2929, 253, 3997, 10495, 1566, 310, 247, 5075, 8245, 253, 4679, 812, 320, 10046, 342, 7269, 35615, 824, 347, 39707, 390, 1881, 42959, 1566, 50274, 926, 854, 625, 8813, 327, 2139, 253, 747, 2957, 6518, 625, 275, 253, 4758, 588, 320, 14109, 50276, 926, 898, 253, 340, 10565, 2193, 564, 598, 281, 8437, 17, 849, 84, 326, 1896, 849, 1057, 253, 747, 2957, 1361, 275, 436, 1083, 4583, 891, 751, 253, 2934, 273, 436, 2929, 2299, 2429, 281, 253, 3236, 789, 253, 4081, 14586, 403, 42876, 1534, 5474, 339, 431, 248, 2929, 29328, 767, 14586, 281, 18902, 11454, 6928, 326, 8046, 731, 281, 26480, 25839, 281, 4067, 3237, 685, 2326, 1309, 3733, 50276, 783, 3237, 2087, 5904, 403, 50276, 18, 1854, 28983, 275, 247, 37045, 4067, 278, 1370, 265, 50276, 19, 8985, 17744, 22661, 4067, 2372, 11559, 495, 7472, 253, 1682, 29992, 2118, 1677, 247, 1899, 12150, 6887, 50276, 783, 14586, 403, 50275, 18, 823, 253, 3302, 1895, 3386, 281, 1046, 3213, 273, 253, 18902, 13782, 48945, 6983, 275, 253, 2929, 374, 6194, 327, 247, 5019, 273, 247, 3963, 2957, 342, 278, 18902, 25142, 285, 342, 247, 2957, 342, 278, 30664, 25142, 835, 253, 11786, 310, 417, 27173, 323, 253, 806, 295, 25142, 50276, 783, 2929, 2722, 326, 253, 7321, 18902, 2990, 33772, 271, 5933, 326, 26414, 281, 253, 3451, 906, 342, 625, 18902, 25142, 3021, 40845, 253, 1895, 273, 689, 37341, 23279, 1663, 347, 48945, 275, 4321, 9380, 50275, 783, 2929, 2007, 2722, 326, 253, 6311, 41886, 5933, 26480, 311, 684, 281, 4067, 10984, 254, 3237, 323, 4227, 253, 6928, 403, 10166, 327, 4567, 2713, 11559, 285, 6760, 327, 23414, 2713, 11559, 285, 10166, 327, 898, 89, 26, 278, 1370, 265, 285, 6760, 327, 598, 281, 848, 89, 1252, 278, 1370, 265, 50276, 783, 2929, 26662, 616, 1332, 490, 77, 569, 32063, 6983, 285, 7321, 2957, 13361, 793, 285, 13361, 793, 342, 6983, 285, 921, 326, 760, 616, 4081, 2990, 33772, 247, 41886, 2900, 326, 26480, 311, 684, 281, 253, 4067, 10984, 254, 1895, 10872, 20544, 50275, 783, 2929, 4518, 3054, 253, 1895, 285, 2410, 1763, 5356, 2722, 247, 2900, 50275, 783, 2929, 310, 3965, 973, 3542, 50276, 20881, 1255, 
50275, 783, 2929, 310, 6760, 327, 247, 1077, 747, 10895, 534, 3133, 7445, 323, 253, 2238, 273, 1332, 5125, 285, 253, 760, 1524, 5301, 310, 253, 789, 597, 3587, 1973, 327, 368, 476, 1900, 1089, 247, 10895, 323, 534, 634, 1332, 310, 253, 1682, 285, 436, 2929, 556, 247, 2372, 273, 326, 5471, 3707, 323, 253, 29992, 6887, 253, 6667, 403, 747, 285, 20953, 3022, 2139, 513, 359, 878, 841, 747, 49602, 2139, 417, 7472, 327, 690, 625, 2629, 49602, 24088, 2460, 9162, 29429, 318, 2969, 2805, 66, 5366, 74, 323, 4227, 3966, 50276, 12563, 253, 2372, 22661, 285, 37045, 1854, 4560, 403, 35820, 1365, 14042, 342, 2629, 1133, 38059, 11333, 2139, 651, 368, 971, 281, 897, 247, 11454, 2036, 327, 731, 604, 253, 49602, 403, 417, 4722, 3237, 275, 616, 1211, 987, 597, 1537, 1335, 320, 5322, 49602, 984, 597, 23395, 40924, 6644, 690, 2173, 20276, 12620, 533, 840, 352, 1364, 320, 1077, 4518, 5544, 849, 436, 16027, 896, 715, 1805, 16161, 1524, 10186, 3237, 253, 29992, 6887, 4836, 310, 253, 6815, 383, 1650, 1895, 533, 840, 969, 849, 1057, 436, 7277, 281, 1633, 751, 355, 545, 1370, 2771, 390, 667, 643, 4980, 278, 291, 84, 2746, 47515, 253, 5649, 15180, 26647, 436, 1364, 320, 11120, 5544, 50274, 783, 2929, 36908, 2319, 285, 7277, 281, 253, 1077, 4623, 789, 18902, 38524, 6928, 391, 30930, 337, 391, 83, 2224, 403, 18902, 4216, 11454, 6928, 285, 1077, 1077, 2074, 1097, 747, 3869, 452, 253, 1072, 3453, 250, 6259, 3113, 250, 1179, 282, 1814, 264, 89, 2605, 285, 253, 391, 30930, 5593, 247, 2957, 327, 1046, 3213, 273, 18902, 13782, 2074, 281, 253, 7321, 2957, 253, 391, 30930, 671, 33772, 247, 41886, 5933, 285, 2087, 4219, 281, 12150, 8892, 891, 651, 1077, 1199, 11435, 247, 5301, 281, 247, 391, 30930, 2990, 275, 512, 273, 253, 747, 49602, 390, 5636, 600, 16344, 327, 690, 273, 253, 1072, 49602, 347, 275, 337, 891, 9101, 253, 3045, 588, 320, 1077, 2074, 323, 253, 767, 6928, 50274, 783, 2957, 2801, 9765, 4373, 19484, 310, 2238, 273, 275, 70, 1851, 386, 452, 253, 4477, 3597, 3365, 10499, 247, 2957, 327, 1046, 18902, 19502, 751, 275, 337, 436, 651, 671, 11907, 247, 41886, 5933, 285, 651, 320, 19554, 50276, 18, 50276, 17183, 78, 391, 270, 1349, 21118, 1484, 50276, 6481, 508, 258, 4765, 18902, 38524, 6928, 275, 4567, 2109, 8059, 327, 11454, 1491, 5162, 2718, 5723, 2824, 4765, 24325, 1544, 476, 2960, 1936, 4562, 7266, 495, 23926, 1423, 3141, 12524, 2929, 533, 943, 320, 6760, 327, 625, 1846, 49602, 50276, 1417, 20953, 7533, 285, 943, 2319, 285, 7277, 281, 337, 2490, 187, 4118, 18435, 27, 2520, 310, 271, 4722, 789, 285, 891, 21434, 253, 4477, 281, 1978, 13383, 436, 3884, 273, 2561, 19235, 891, 1928, 751, 253, 7714, 275, 697, 1655, 5981, 310, 417, 4704, 323, 14924, 50276, 783, 2561, 3884, 310, 7964, 15560, 18398, 446, 2149, 534, 2789, 253, 7103, 273, 253, 789, 247, 2372, 28190, 1335, 891, 1158, 326, 690, 273, 253, 2792, 5439, 407, 253, 30628, 2186, 323, 24088, 253, 878, 273, 3081, 1666, 25379, 281, 2085, 247, 2372, 273, 3634, 323, 752, 310, 1469, 327, 74, 2096, 326, 253, 4477, 1859, 616, 789, 347, 271, 7756, 273, 253, 3786, 4081, 19641, 2990, 2299, 326, 310, 247, 3332, 10336, 417, 10481, 4232, 417, 281, 2430, 3081, 8245, 323, 14023, 436, 5678, 342, 253, 22458, 600, 273, 253, 10895, 2789, 352, 1663, 1892, 281, 5963, 253, 789, 50275, 783, 3630, 484, 1537, 671, 2430, 247, 2372, 273, 4116, 275, 1798, 352, 3133, 247, 2257, 273, 1774, 4278, 273, 253, 789, 390, 8254, 6787, 5001, 253, 1332, 7402, 598, 275, 253, 30762, 247, 2257, 273, 253, 4577, 1841, 37317, 8042, 562, 253, 4477, 987, 2920, 594, 14969, 275, 253, 30080, 22559, 285, 12661, 281, 4993, 
2299, 891, 1928, 436, 1537, 990, 598, 10568, 247, 2372, 273, 40386, 273, 253, 7714, 2581, 326, 6240, 1841, 387, 253, 990, 273, 253, 30762, 891, 671, 6780, 285, 5194, 342, 253, 3159, 4680, 1146, 689, 19052, 275, 436, 10076, 50276, 1752, 318, 2175, 690, 2218, 347, 629, 273, 253, 30080, 22559, 1537, 320, 671, 247, 2234, 4445, 281, 755, 436, 789, 689, 253, 8416, 1386, 24088, 253, 5955, 1475, 253, 13439, 2957, 891, 14409, 326, 253, 4477, 858, 1408, 690, 273, 1110, 4679, 2167, 891, 1928, 247, 625, 275, 6864, 1007, 387, 253, 1543, 285, 7914, 273, 731, 24088, 417, 2819, 816, 387, 2457, 3045, 533, 387, 253, 8770, 273, 253, 985, 285, 24399, 731, 275, 253, 2022, 7714, 812, 671, 2085, 10665, 3081, 12288, 275, 253, 4081, 10336, 50275, 2577, 2022, 7664, 310, 326, 275, 697, 1655, 5981, 253, 2929, 1537, 417, 990, 598, 1907, 253, 3486, 352, 22828, 285, 667, 273, 253, 2544, 1840, 588, 10260, 3157, 253, 3290, 285, 253, 4116, 253, 789, 588, 755, 275, 253, 3114 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 5430, 1057, 7162, 23554, 689, 673, 513, 253, 3210, 3839, 2489, 625, 13224, 342, 247, 3687, 1180, 273, 25142, 604, 594, 812, 7162, 320, 908, 347, 247, 2625, 323, 1125, 11168, 4087, 17221, 849, 1142, 25142, 281, 1347, 2581, 685, 1907, 281, 3609, 436, 407, 1133, 50276, 9453, 4361, 253, 2929, 323, 253, 806, 673, 352, 310, 12744, 835, 253, 1543, 275, 4677, 337, 1705, 432, 516, 7384, 841, 1543, 2126, 1097, 273, 253, 4081, 18149, 533, 352, 651, 320, 9371, 281, 13199, 326, 436, 310, 253, 1083, 26332, 326, 26724, 4680, 2718, 651, 417, 320, 7032, 273, 9603, 841, 1543, 50276, 783, 13691, 323, 2829, 337, 273, 253, 30762, 2296, 591, 4064, 3185, 273, 1347, 50276, 783, 13691, 323, 2829, 577, 275, 253, 30762, 10414, 253, 17744, 2020, 4836, 533, 253, 2829, 4620, 281, 3831, 1543, 432, 253, 37045, 4836, 50276, 255, 253, 990, 273, 253, 2022, 2593, 342, 1543, 327, 253, 37045, 4836, 253, 9414, 310, 6828, 281, 2593, 247, 20, 281, 923, 271, 1650, 273, 247, 848, 89, 1252, 37045, 533, 436, 310, 2686, 275, 2593, 247, 24, 50274, 783, 4081, 2746, 310, 18511, 285, 253, 1543, 403, 12532, 533, 627, 403, 247, 1643, 3374, 326, 3058, 281, 320, 20045, 562, 1690, 690, 3081, 1666, 25379, 285, 37699, 273, 253, 5438, 6866, 323, 253, 2361, 1543, 50276, 11183, 846, 5955, 2180, 253, 4477, 2908, 247, 1180, 273, 24864, 1543, 285, 27096, 3081, 1453, 4679, 891, 1335, 1158, 326, 253, 1543, 651, 320, 625, 21414, 604, 2429, 281, 247, 16055, 2491, 273, 12085, 1666, 25379, 533, 891, 1158, 841, 747, 906, 11523, 468, 3825, 403, 247, 1534, 2217, 7756, 281, 15785, 247, 4868, 2572, 5474, 339, 431, 248, 2929, 29328, 767, 14586, 281, 18902, 11454, 6928, 326, 1361, 3157, 26647, 327, 1264, 13506, 8892, 275, 1798, 253, 4477, 3359, 247, 6983, 5122, 949, 12541, 10291, 281, 3657, 253, 18902, 2990, 432, 10305, 3236, 3280, 1491, 846, 1142, 25142, 275, 1635, 597, 12421, 4647, 28069, 896, 44263, 318, 949, 673, 281, 253, 13782, 4216, 281, 1973, 271, 24026, 2957, 534, 13064, 6419, 689, 31893, 281, 253, 1180, 273, 18902, 5018, 253, 10166, 2990, 476, 8415, 3012, 12150, 3237, 407, 9591, 625, 18902, 25142, 4645, 2266, 26647, 3745, 50276, 856, 84, 50276, 783, 2929, 4245, 4891, 18149, 281, 253, 2934, 273, 3676, 2181, 6928, 5807, 29815, 50040, 1162, 355, 43425, 253, 4028, 310, 3839, 2590, 285, 3477, 281, 956, 253, 14586, 403, 973, 24013, 8550, 50276, 12025, 272, 253, 32213, 273, 253, 3236, 1566, 50276, 783, 4679, 7568, 973, 253, 12510, 273, 253, 14586, 41731, 14692, 253, 1666, 25379, 407, 247, 1534, 8459, 627, 403, 7000, 28913, 2175, 285, 8770, 6260, 326, 1918, 16039, 715, 253, 6703, 2444, 273, 253, 1566, 50275, 5040, 50276, 783, 14586, 403, 32809, 970, 1846, 5609, 751, 12541, 4602, 285, 28069, 896, 44263, 318, 949, 673, 5061, 321, 28761, 970, 841, 5609, 588, 1361, 3157, 253, 3045, 50275, 783, 8892, 403, 20953, 763, 352, 651, 320, 625, 34593, 604, 253, 1332, 812, 789, 323, 15958, 941, 390, 690, 8946, 49602, 50275, 783, 2022, 7714, 38771, 690, 4278, 273, 253, 1332, 923, 3533, 2708, 50276, 1189, 455, 891, 751, 253, 2934, 273, 436, 2929, 534, 29328, 271, 4722, 1039, 281, 10738, 13760, 26647, 407, 6941, 18902, 6928, 281, 1158, 625, 1309, 17032, 2299, 347, 5393, 1840, 2429, 281, 253, 3236, 789, 253, 4081, 14586, 403, 16888, 352, 310, 15246, 326, 970, 12541, 10291, 310, 4619, 281, 3733, 3676, 6928, 281, 11757, 5816, 3280, 1491, 390, 11786, 29199, 3966, 50276, 783, 13439, 2957, 310, 625, 4722, 2568, 352, 310, 12744, 432, 253, 4028, 326, 436, 310, 247, 4460, 7680, 281, 253, 3733, 273, 391, 79, 2224, 24088, 2429, 281, 643, 24026, 3733, 323, 391, 9866, 347, 275, 492, 249, 
73, 1162, 355, 4765, 752, 310, 253, 3064, 285, 5750, 273, 970, 253, 4081, 1332, 50275, 1747, 13218, 5661, 1543, 760, 29992, 25351, 941, 3133, 1524, 533, 1060, 253, 906, 310, 417, 13943, 347, 627, 310, 760, 247, 577, 7756, 275, 2426, 273, 5241, 7200, 253, 3045, 273, 253, 4081, 1332, 327, 253, 643, 767, 13506, 941, 310, 12532, 50276, 74, 717, 14338, 1880, 253, 1332, 760, 2987, 323, 841, 2173, 8892, 390, 310, 9017, 494, 281, 643, 3237, 22453, 14800, 824, 347, 3491, 42162, 6983, 4216, 14720, 38854, 1162, 355, 4022, 390, 3626, 941, 2505, 3888, 50276, 34974, 285, 5701, 50275, 262, 310, 5272, 281, 897, 260, 9866, 10336, 323, 4440, 44549, 14800, 2299, 323, 17744, 2204, 253, 3280, 310, 247, 3425, 273, 470, 285, 337, 858, 368, 4647, 260, 9866, 1060, 50275, 5371, 310, 253, 2957, 1159, 281, 1056, 253, 2929, 1881, 41010, 275, 4706, 495, 390, 577, 368, 943, 6266, 253, 3453, 5981, 285, 253, 2957, 908, 281, 6194, 253, 1566, 432, 30762, 247, 352, 3133, 326, 253, 2457, 3453, 310, 247, 374, 69, 42615, 50276, 783, 5199, 273, 17221, 253, 17032, 19502, 323, 3045, 6814, 943, 320, 5393, 275, 253, 2022, 7714, 50275, 783, 1783, 651, 320, 625, 27096, 604, 253, 4477, 812, 7277, 253, 1543, 273, 643, 25142, 342, 326, 273, 253, 5241, 581, 849, 1199, 3064, 858, 597, 29623, 281, 253, 2457, 3453, 347, 5125, 275, 3036, 898, 50276, 1542, 17744, 2204, 858, 368, 1071, 342, 3356, 6430, 476, 634, 1332, 39970, 281, 27277, 2713, 2139, 627, 310, 642, 3045, 24426, 323, 37045, 848, 89, 1252, 50276, 249, 253, 2929, 253, 3997, 10495, 1566, 310, 247, 5075, 8245, 253, 4679, 812, 320, 10046, 342, 7269, 35615, 824, 347, 39707, 390, 1881, 42959, 1566, 50274, 926, 854, 625, 8813, 327, 2139, 253, 747, 2957, 6518, 625, 275, 253, 4758, 588, 320, 14109, 50276, 926, 898, 253, 340, 10565, 2193, 564, 598, 281, 8437, 17, 849, 84, 326, 1896, 849, 1057, 253, 747, 2957, 1361, 275, 436, 1083, 4583, 891, 751, 253, 2934, 273, 436, 2929, 2299, 2429, 281, 253, 3236, 789, 253, 4081, 14586, 403, 42876, 1534, 5474, 339, 431, 248, 2929, 29328, 767, 14586, 281, 18902, 11454, 6928, 326, 8046, 731, 281, 26480, 25839, 281, 4067, 3237, 685, 2326, 1309, 3733, 50276, 783, 3237, 2087, 5904, 403, 50276, 18, 1854, 28983, 275, 247, 37045, 4067, 278, 1370, 265, 50276, 19, 8985, 17744, 22661, 4067, 2372, 11559, 495, 7472, 253, 1682, 29992, 2118, 1677, 247, 1899, 12150, 6887, 50276, 783, 14586, 403, 50275, 18, 823, 253, 3302, 1895, 3386, 281, 1046, 3213, 273, 253, 18902, 13782, 48945, 6983, 275, 253, 2929, 374, 6194, 327, 247, 5019, 273, 247, 3963, 2957, 342, 278, 18902, 25142, 285, 342, 247, 2957, 342, 278, 30664, 25142, 835, 253, 11786, 310, 417, 27173, 323, 253, 806, 295, 25142, 50276, 783, 2929, 2722, 326, 253, 7321, 18902, 2990, 33772, 271, 5933, 326, 26414, 281, 253, 3451, 906, 342, 625, 18902, 25142, 3021, 40845, 253, 1895, 273, 689, 37341, 23279, 1663, 347, 48945, 275, 4321, 9380, 50275, 783, 2929, 2007, 2722, 326, 253, 6311, 41886, 5933, 26480, 311, 684, 281, 4067, 10984, 254, 3237, 323, 4227, 253, 6928, 403, 10166, 327, 4567, 2713, 11559, 285, 6760, 327, 23414, 2713, 11559, 285, 10166, 327, 898, 89, 26, 278, 1370, 265, 285, 6760, 327, 598, 281, 848, 89, 1252, 278, 1370, 265, 50276, 783, 2929, 26662, 616, 1332, 490, 77, 569, 32063, 6983, 285, 7321, 2957, 13361, 793, 285, 13361, 793, 342, 6983, 285, 921, 326, 760, 616, 4081, 2990, 33772, 247, 41886, 2900, 326, 26480, 311, 684, 281, 253, 4067, 10984, 254, 1895, 10872, 20544, 50275, 783, 2929, 4518, 3054, 253, 1895, 285, 2410, 1763, 5356, 2722, 247, 2900, 50275, 783, 2929, 310, 3965, 973, 3542, 50276, 20881, 1255, 
50275, 783, 2929, 310, 6760, 327, 247, 1077, 747, 10895, 534, 3133, 7445, 323, 253, 2238, 273, 1332, 5125, 285, 253, 760, 1524, 5301, 310, 253, 789, 597, 3587, 1973, 327, 368, 476, 1900, 1089, 247, 10895, 323, 534, 634, 1332, 310, 253, 1682, 285, 436, 2929, 556, 247, 2372, 273, 326, 5471, 3707, 323, 253, 29992, 6887, 253, 6667, 403, 747, 285, 20953, 3022, 2139, 513, 359, 878, 841, 747, 49602, 2139, 417, 7472, 327, 690, 625, 2629, 49602, 24088, 2460, 9162, 29429, 318, 2969, 2805, 66, 5366, 74, 323, 4227, 3966, 50276, 12563, 253, 2372, 22661, 285, 37045, 1854, 4560, 403, 35820, 1365, 14042, 342, 2629, 1133, 38059, 11333, 2139, 651, 368, 971, 281, 897, 247, 11454, 2036, 327, 731, 604, 253, 49602, 403, 417, 4722, 3237, 275, 616, 1211, 987, 597, 1537, 1335, 320, 5322, 49602, 984, 597, 23395, 40924, 6644, 690, 2173, 20276, 12620, 533, 840, 352, 1364, 320, 1077, 4518, 5544, 849, 436, 16027, 896, 715, 1805, 16161, 1524, 10186, 3237, 253, 29992, 6887, 4836, 310, 253, 6815, 383, 1650, 1895, 533, 840, 969, 849, 1057, 436, 7277, 281, 1633, 751, 355, 545, 1370, 2771, 390, 667, 643, 4980, 278, 291, 84, 2746, 47515, 253, 5649, 15180, 26647, 436, 1364, 320, 11120, 5544, 50274, 783, 2929, 36908, 2319, 285, 7277, 281, 253, 1077, 4623, 789, 18902, 38524, 6928, 391, 30930, 337, 391, 83, 2224, 403, 18902, 4216, 11454, 6928, 285, 1077, 1077, 2074, 1097, 747, 3869, 452, 253, 1072, 3453, 250, 6259, 3113, 250, 1179, 282, 1814, 264, 89, 2605, 285, 253, 391, 30930, 5593, 247, 2957, 327, 1046, 3213, 273, 18902, 13782, 2074, 281, 253, 7321, 2957, 253, 391, 30930, 671, 33772, 247, 41886, 5933, 285, 2087, 4219, 281, 12150, 8892, 891, 651, 1077, 1199, 11435, 247, 5301, 281, 247, 391, 30930, 2990, 275, 512, 273, 253, 747, 49602, 390, 5636, 600, 16344, 327, 690, 273, 253, 1072, 49602, 347, 275, 337, 891, 9101, 253, 3045, 588, 320, 1077, 2074, 323, 253, 767, 6928, 50274, 783, 2957, 2801, 9765, 4373, 19484, 310, 2238, 273, 275, 70, 1851, 386, 452, 253, 4477, 3597, 3365, 10499, 247, 2957, 327, 1046, 18902, 19502, 751, 275, 337, 436, 651, 671, 11907, 247, 41886, 5933, 285, 651, 320, 19554, 50276, 18, 50276, 17183, 78, 391, 270, 1349, 21118, 1484, 50276, 6481, 508, 258, 4765, 18902, 38524, 6928, 275, 4567, 2109, 8059, 327, 11454, 1491, 5162, 2718, 5723, 2824, 4765, 24325, 1544, 476, 2960, 1936, 4562, 7266, 495, 23926, 1423, 3141, 12524, 2929, 533, 943, 320, 6760, 327, 625, 1846, 49602, 50276, 1417, 20953, 7533, 285, 943, 2319, 285, 7277, 281, 337, 2490, 187, 4118, 18435, 27, 2520, 310, 271, 4722, 789, 285, 891, 21434, 253, 4477, 281, 1978, 13383, 436, 3884, 273, 2561, 19235, 891, 1928, 751, 253, 7714, 275, 697, 1655, 5981, 310, 417, 4704, 323, 14924, 50276, 783, 2561, 3884, 310, 7964, 15560, 18398, 446, 2149, 534, 2789, 253, 7103, 273, 253, 789, 247, 2372, 28190, 1335, 891, 1158, 326, 690, 273, 253, 2792, 5439, 407, 253, 30628, 2186, 323, 24088, 253, 878, 273, 3081, 1666, 25379, 281, 2085, 247, 2372, 273, 3634, 323, 752, 310, 1469, 327, 74, 2096, 326, 253, 4477, 1859, 616, 789, 347, 271, 7756, 273, 253, 3786, 4081, 19641, 2990, 2299, 326, 310, 247, 3332, 10336, 417, 10481, 4232, 417, 281, 2430, 3081, 8245, 323, 14023, 436, 5678, 342, 253, 22458, 600, 273, 253, 10895, 2789, 352, 1663, 1892, 281, 5963, 253, 789, 50275, 783, 3630, 484, 1537, 671, 2430, 247, 2372, 273, 4116, 275, 1798, 352, 3133, 247, 2257, 273, 1774, 4278, 273, 253, 789, 390, 8254, 6787, 5001, 253, 1332, 7402, 598, 275, 253, 30762, 247, 2257, 273, 253, 4577, 1841, 37317, 8042, 562, 253, 4477, 987, 2920, 594, 14969, 275, 253, 30080, 22559, 285, 12661, 281, 4993, 
2299, 891, 1928, 436, 1537, 990, 598, 10568, 247, 2372, 273, 40386, 273, 253, 7714, 2581, 326, 6240, 1841, 387, 253, 990, 273, 253, 30762, 891, 671, 6780, 285, 5194, 342, 253, 3159, 4680, 1146, 689, 19052, 275, 436, 10076, 50276, 1752, 318, 2175, 690, 2218, 347, 629, 273, 253, 30080, 22559, 1537, 320, 671, 247, 2234, 4445, 281, 755, 436, 789, 689, 253, 8416, 1386, 24088, 253, 5955, 1475, 253, 13439, 2957, 891, 14409, 326, 253, 4477, 858, 1408, 690, 273, 1110, 4679, 2167, 891, 1928, 247, 625, 275, 6864, 1007, 387, 253, 1543, 285, 7914, 273, 731, 24088, 417, 2819, 816, 387, 2457, 3045, 533, 387, 253, 8770, 273, 253, 985, 285, 24399, 731, 275, 253, 2022, 7714, 812, 671, 2085, 10665, 3081, 12288, 275, 253, 4081, 10336, 50275, 2577, 2022, 7664, 310, 326, 275, 697, 1655, 5981, 253, 2929, 1537, 417, 990, 598, 1907, 253, 3486, 352, 22828, 285, 667, 273, 253, 2544, 1840, 588, 10260, 3157, 253, 3290, 285, 253, 4116, 253, 789, 588, 755, 275, 253, 3114 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the authors present a way to transform data from a source distribution to have characteristics of a target distribution this is accomplished by applying a neuronedit function to the encoding of the input this edited input is then decoded the neuronedit function is parametrized by the target distributions statistics the edit function does a sort of simple histogram matching so that the ith percentile values of the source distributions bottleneck representations instead become the ith percentile values of the target distributions bottleneck representations experiments are on cifar10 and biology datasets the latter of which are not my strong suit this paper is wellwritten and original it is original because there are only a few works which directly manipulate the latent space one example is latent space interpolation used to visualize gans and this is distinct from those the problem they aim to solve also has not received much attention which enhances the novelty of this paper the presented method is simple and easy to implement since the editing function is not learned but is instead deterministic it is encapsulated in equation 1 the fact that the editing function is fixed may greatly hinder its flexibility and applicability in section 31 and figure 1 we are shown that neuroneditting can turn images of horses with white backgrounds into images of horses with dark backgrounds horses are an unseen class neuroneditting turns the horse darker as well it seems that one could change the brightness and contrast of the image to obtain a similar effect or one could or take the geometric mean of the image in 01 with the average target image and obtain a similar effect such traditional methods are also robust to unseen classes moreover neuronedittings ability to change the brightness of the image is not that surprising given that brightness is some of the most basic image information in point of fact it is captured by the dc coefficient the very first coefficient from the discrete cosine transform which is used in jpeg what else can neuroneditting do in the image domain can this be used to rotate or reflect mnist digits the biological experiments also appear to involve simple input transformations fine points an edit function between the the i am not sure the speculation about this methods loose relation to word2vec belongs in a scientific work both involve modifications to a neural representations but no further relation is justified in the paper was the dataset partitioning for the cifar10 experiment done manually if not what process partitioned the dataset edit some of my suggestions were incorporated in the rebuttal but my sentiment is still that this is almost at the acceptance threshold the large focus on biology makes much of this paper harder to evaluate or appreciatedocsepthe paper demonstrates that we can harness semantically meaningful features learned by a pretrained autoencoder ae to define a determinisc transformation eg math operations on latent space to transform one distribution a into another distribution b the original ae was pretrained on a larger distribution that includes both a and b a key contribution of this paper is the interesting demonstration that this method called neuron editing allows us to perform a transformation t that transforms pretreatment observations into posttreatment observations which is useful in the medical or biological setting novelty neuron 
editing is essentially a common technique of performing arithmetics in the latent space eg king man woman queen in nlp or man wearing sunglass man woman woman wearing sunglasses eg in image domain eg in alec radford et al 2015 therefore the novelty is limited significance the main contribution of this paper is the empirical demonstration that such transformation t is better defined rather than learned directly from data eg via gans i should note that im not too familiar with the biology datasets in sec 32 and sec 33 in order to fully appreciate the practical impact of neuron editing clarity i think some key reasons behind why neuron editing works could be more clearly presented that is the key here is we use pretrained aes to perform a predefined transformation i think the key might not be whether we use gans or not it is how we use them i guess if we use ali ie training a gan concurrently with an ae to perform neuron editing the result should work as welldocsepthe authors proposed a novel method of making data transformation that is much easier to extend to the cases where the input distribution is different from the one that is used to the train the model insample vs outsample this has a lot of application in removing experimental noise in biological data also known as batch effects the idea is to learn a representation that separates background dimensions that do not vary across data points but may be subject to change in a data transformation and foreground dimensions that vary between data points under the same background and then apply a fixed linear transformation in the learned representation space this is different from other approaches such as gan where the transformation is learned entirely based on the data in addition it mitigates some known problems such as the mode collapse in gan by just learning a good representation this is proposed to be done by an autoencoder trained on both insamples and outsamples the transformation is however adjusted based on the insamples only experimental results are appealing in different applications compared to gan resnetgan and cyclegan here are my major concerns the idea seems to be very general and indeed is applicable to any latent representation learning method and not just autoencoders is there any reason that other more complicated unsupervised representation learning methods were not used for benchmarking in the paper the method heavily relies on the quality of the unsupervised learned representation how one is guaranteed that the transformation in the learned space be simple and piecewise linear shouldnt we consider a regularization method to guide the unsupervised learning more appropriately the method also implicitly assumes that the same neurons model background and foreground in the insample and outsample data points how is that guaranteed in practice ### Summary:
this was a borderline paper as reviewers generally agreed that the method was new and was appropriately explained and motivated and had reasonable experimental results the main drawbacks were that the significance of the method was unclear in particular the method might be too inflexible due to being based on a hardcoded rule and it is not clear why this is the right approach relative to eg gans with a modified training objective reviewers also had difficulty assessing the significance of the results on biological datasets while such results certainly add to the paper the paper would be stronger if the argument for significance could be assessed from more standard datasets a note on the review process the reviewers initially scored the paper 6 6 6 but the review text for some of the reviews was more negative than a typical 6 score to confirm this i asked if any reviewers wanted to push for acceptance none of the reviewers did generally due to feeling the significance of the results was limited and two of the reviewers decided to lower their scores to account for this
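The neuron editing operation discussed in the reviews above amounts to per-neuron percentile matching between a source and a target distribution in a pretrained autoencoder's latent space, with the edited codes then passed through the decoder. A minimal sketch follows, assuming numpy arrays of codes (rows are samples, columns are neurons); the function and variable names are hypothetical and this is not the paper's actual equation 1.

```python
# Hedged sketch of per-neuron percentile (histogram) matching in latent space.
import numpy as np

def neuron_edit(z_input, z_source, z_target):
    """Map each latent dimension of z_input so that source-distribution
    percentiles land on the corresponding target-distribution percentiles."""
    z_out = np.empty_like(z_input)
    for j in range(z_input.shape[1]):
        src = np.sort(z_source[:, j])
        # Percentile rank of each input value within the source distribution.
        ranks = np.clip(100.0 * np.searchsorted(src, z_input[:, j]) / len(src), 0.0, 100.0)
        # Substitute the value at the same percentile of the target distribution.
        z_out[:, j] = np.percentile(z_target[:, j], ranks)
    return z_out

# Usage (hypothetical encoder/decoder from a pretrained autoencoder):
# x_edited = decoder(neuron_edit(encoder(x), encoder(source_batch), encoder(target_batch)))
```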
input_ids: [ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, … ] (token-ID encoding of this row's Input and Output text; the full list is omitted here for readability)
attention_mask: [ 1, 1, 1, … ] (all ones, i.e. no padding)
labels: [ 30003, 310, 1677, 2278, … ] (identical to input_ids)
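The three abbreviated columns above are simply the tokenized form of the row's text. As a rough sketch of how such rows are typically built (the tokenizer checkpoint, the concatenation order, and the truncation settings are assumptions, not details given in this dump):

```python
# Hypothetical reconstruction of the tokenized columns; not the dataset's
# actual preprocessing script.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed checkpoint

def build_row(input_text: str, output_text: str, max_len: int = 2048) -> dict:
    # The review prompt (Input) and the target summary (Output) appear to be
    # tokenized as one concatenated sequence, truncated to max_len tokens.
    enc = tokenizer(input_text + output_text, truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all ones when no padding is applied
        "labels": list(enc["input_ids"]),         # labels mirror input_ids, as in the rows shown
    }
```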
Below is a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:

This paper presents the idea of enforcing better condition numbers for the inner loops in meta-learning frameworks. The authors reformulate the meta-learning loss as a least-squares formulation, which enables them to easily approximate condition numbers; this is then used as an additional loss term to minimize, which results in better performance on few-shot classification tasks. They demonstrate that their approach converges rapidly, especially over the first few iterations. I generally like the idea of achieving rapid convergence without introducing more parameters. The paper is well written and easy to follow, and the idea of using the least-squares reformulation to easily approximate eigenvalues is quite interesting. However, I have some concerns about the experimental results (below, in the question section), and I would revise my score depending on the authors' response. The authors addressed some limitations in the main text, and I do not see any negative societal impact.

docsep

The authors propose a regularization method for improving the conditioning of MAML. Experiments show the method consistently reduces the number of steps needed for adaptation and improves accuracy.

Strengths:
- The method seems very effective at reducing the number of required MAML steps.
- Consistently outperforms MAML.
- Simple additional loss during training; no cost at inference time.
- The authors study the effect of using a parameter subset.

Weaknesses:
- The preconditioning loss is only applied in the top layers, as mentioned by the authors, but this may not be a big problem since the top layers are the ones adapted the most during meta-testing.

Yes.

docsep

The paper proposes a regularisation term for the outer loop of MAML to encourage a well-conditioned parameter space that improves inner-loop adaptation. The experiments suggest that the condition number is correlated with performance within a few gradient updates, and training with the proposed regularisation term is shown to empirically improve upon MAML.

Strengths:
- Does not require additional parameters for adaptation, unlike existing preconditioning methods.
- Interesting analysis regarding the connection between the condition number and few-step performance on an unseen task.
- Promising experimental results show an empirical benefit of preconditioning.

Weaknesses:
- Requires higher-order gradients, so it is computationally expensive, like MAML.
- Computing the conditioning term is expensive; as such, instead of applying the conditioning constraint to the entire model, only a small subset of parameters can have the constraint applied.
- Missing comparison with existing preconditioning methods (for example, see Section 2.1 of WarpGrad).

Overall I would rate the paper: novelty medium, clarity high, significance medium. Yes.

docsep

Improving the performance of a deep neural network on a new, unseen task with a limited number of new datapoints and adaptation epochs is one of the central problems in modern deep learning. The authors propose to improve the performance of MAML, a benchmark algorithm for few-shot adaptation problems, by preconditioning the parameter space using the condition number of the network. Instead of directly using the condition number to condition the bilevel optimization problem of MAML, the authors propose to consider the distribution of all eigenvalues, using the approximated logarithmic eigenvalues for increased expressiveness. This simple modification to MAML is shown to be highly effective at improving the few-shot adaptation process across diverse datasets.

Strengths: the proposed method is theoretically sound, interesting, and novel. Although it is a simple modification to an existing algorithm (MAML), there is enough novelty to be recognized in the idea of better conditioning a parameter space using the condition number of the Hessian matrix. In addition to the strong theoretical motivation behind the proposed approach, the authors experimentally demonstrate the relationship between the condition number of a network and its adaptation capabilities. The proposed method is shown to be highly effective at allowing more rapid few-shot adaptation: according to the experimental results, it improves the adaptation performance of MAML across all adaptation steps, but the degree of improvement is particularly noteworthy under a limited number of adaptation steps (1 or 2). The writing is concise and clear; it was easy to follow how the modified objective function is derived by introducing the conditioning constraint into the MAML bilevel optimization problem.

Weaknesses: maybe more baseline approaches from meta-learning and/or few-shot learning other than MAML could be added for comparison. Can MAML with the conditioned parameter space (the proposed method) outperform more recent meta-learning algorithms? Also, correct me if I'm wrong, but it appears that the parameter-space conditioning can be integrated with other meta-learning algorithms as well; does it lead to performance improvement regardless of the choice of base meta-learning algorithm?

The authors adequately acknowledge the computational complexity of their method in Section 5. Please refer to the weaknesses and questions for additional concerns and questions.

### Summary:
This paper was quite well received by reviewers, with scores of 5, 6, 6, 8. Reviewers felt the paper was well written and clear and expressed an interesting core idea, and the experimental results compare against MAML and show clear improvements. The key idea was inspired by preconditioning: the method aims to increase adaptation speed for gradient-based meta-learning methods without incurring extra parameters. The paper recasts the optimisation as a nonlinear least-squares formulation and proposes a way to enforce a well-conditioned parameter space for meta-learning through the condition-number and local-curvature perspectives. Experiments show that the approach outperforms unconstrained optimization significantly and does particularly well during the initial adaptation phase. The AC recommends acceptance.
input_ids: [ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, … ] (token-ID encoding of this row's Input and Output text; the full list is omitted here for readability)
attention_mask: [ 1, 1, 1, … ] (all ones, i.e. no padding)
labels: [ 30003, 310, 1677, 2278, … ] (identical to input_ids)
Below is a review of a research paper from a conference or journal. Please write a summary of the review.

### Review:

This work studies the problem of how to define learning rate schedules when training deep models so that the models generalize better. To this end, the paper proposes and evaluates a learning rate schedule that consists of two stages (the "knee" schedule): a first stage of exploration adopts a high learning rate, and this initial stage is followed by a second stage in which the learning rate decreases linearly. Extensive experimental results on both text and image data show that the proposed scheme allows one to train faster, or to obtain better results within a fixed computational budget, and the proposed schedule leads to SOTA results on the IWSLT'14 De-En and WMT'14 De-En datasets. The work relates the good performance of the proposed knee schedule to the hypothesis that wide minima have a lower density (they are less common); therefore a large learning rate is required initially, and for some time, to avoid shallow minima, while the second refinement stage, with the learning rate declining linearly, allows one to delve into the minimum found in the exploration stage. Recent works indicate that wide minima are in fact the ones that lead models to generalize better, and this is in agreement with the experimental results of the article. The main contribution of the article is an exhaustive experimental evaluation across different applications, where the authors analyze different schedules and show how the proposed schedule leads to superior performance. The paper raises a working hypothesis compatible with the success of the LR schedule and, in that sense, opens an interesting line of research to continue.

Some questions:
1. From reading the article it is not clear to me how it is justified to keep the learning rate high even when the loss stagnates. I understand this is based on conducting experiments and then measuring the power of generalization, but it is interesting that, from the training point of view, it would seem that after training stagnates the network is not learning but pivoting from one side to the other. What do you think could be a good hypothesis for what is happening during training at this stage? I would like, if possible, for this point to be better discussed, and it would also be useful if the work better discussed why the working hypothesis is the most reasonable explanation.
2. Table 1 shows that reducing the learning rate after the exploration stage helps to better minimize the loss; however, this does not translate into a network that generalizes better. Is it reasonable to hypothesize that during this second period the network overfits to the behavior around this minimum? Does this phenomenon occur in other experiments? If so, why is the second refinement stage needed?
3. Warmup: some optimizers use a warmup step where the learning rate starts to rise smoothly. It would be interesting to better discuss how this stage is linked to the exploration stage, and how long the warmup stage needs to be. If warmup, exploration, and decay are put together, the result is a curve with a certain resemblance to a cosine.

Additionally, if the information is available, it would be useful to have the standard deviations of the average values calculated in Table 6.

docsep

The learning rate schedule plays an important role in deep learning and has a large influence on final performance. Although there are already many schedules, achieving SOTA performance still requires a carefully hand-tuned schedule that may be case by case. Compared with previous learning rate schedules, the authors first conjecture that the number of wide minima is significantly lower than the number of sharp minima, and then propose to use a large learning rate in the initialization phase for sufficient exploration to reach a wide minimum, which may achieve better generalization performance. Extensive experiments validate the proposed learning rate schedule.

The observation of this paper looks interesting, and the authors have conducted many experiments to validate the effects of the proposed learning rate schedule. However, the novelty of this paper seems limited. First, the authors conjecture that the number of wide minima is significantly lower than the number of sharp minima, but the paper lacks a thorough investigation of this conjecture, either from a related empirical study or from theoretical understanding. Second, for the proposed learning rate schedule, it is not very clear how to set the duration of the exploration epochs appropriately across different tasks, as it is still a hand-tuned hyperparameter; for a fixed explore duration of 50, there is not much difference in performance compared with previous schedules such as cosine decay or linear decay in Table 6. Overall I tend toward a weak reject, and it would be much better if the authors could dig deeper behind the observation/conjecture.

docsep

Summary: this paper did an empirical study on the learning rate (LR) schedule for training deep neural networks (DNNs). The authors argue that the density of wide minima is lower than that of sharp minima, and then show that this makes keeping a high LR necessary. Finally, they propose a new LR schedule that maintains a high LR for long enough.

Pros:
- The problem this paper studies is important for DNN training.
- The proposed LR schedule is simple and has the potential to be used widely.
- The authors conduct extensive empirical tests to support their claim, and the experimental design is reasonable.

Cons:
- I'm not fully convinced by the hypothesis that wide minima have lower density. The empirical results can be explained by other hypotheses as well; for example, it is also possible that wide minima are farther away from the initialization. I think the authors need to either provide theoretical analysis or come up with new experiments to further verify this hypothesis.
- The proposed LR schedule does not seem necessary; one could easily achieve the same purpose with existing LR schedules, e.g. a step-decay LR schedule.
- The novelty is low. The main novelty of the paper is the above hypothesis, but it is not supported enough, and the proposed LR schedule is a slightly modified version of existing LR schedules. Thus the contribution of this paper seems incremental.

docsep

Overview: overall, I believe the comparisons to baselines are too problematic to understand the value of the proposed method.

Regarding the reduced-training-budget results ("knee schedule can achieve the same accuracy as the baseline with a much reduced training budget"): were the baseline schedules also retuned for the reduced training budget? If not, this seems like an unfair advantage to the proposed method. For example, in the MLPerf competition (https://arxiv.org/abs/1910.01500), for ImageNet there have been schedules consisting of a linear warmup followed by quadratic decay that have been tuned to reach 75.9% in only 64 epochs, even at massive batch sizes, implying that the baseline schedule in Table 3 could likely do much better than what is reported if it were retuned with the same number of trials as the proposed method, or if a more competitive baseline schedule were used.

Some of the results seem misleading as well. In the training-curve Figures 6, 7, 8, 9, 10 in the appendix, it seems odd that the proposed method only catches up to the untuned baselines towards the very end of training, and that this was not mentioned in the main text. For example: on CIFAR-10, the baseline beats the proposed method until the final 5 out of 200 epochs of training; on BERT-large pretraining, it is unclear from the plots when the proposed method beats the baseline, as the curves are so similar; on WMT'14 En-De, the baseline beats the proposed method until the final 54 out of 70 epochs of training; on IWSLT'14 De-En, the baseline and proposed method cross each other a few times, the final time being at epoch 41 of 50; on IWSLT'14 De-En with the MAT network, the baseline and proposed method cross each other a few times, the final time being at epoch 330 of 400. While it is not invalid for a proposed method to overtake a baseline towards the end of training, these results indicate that perhaps, if the baselines were retuned, they could maintain their better performance for the last few epochs of training. Using the same initial LR for the proposed and baseline methods is useful; however, it is insufficient to demonstrate that the proposed method could still perform well under different initial conditions. I have additional concerns about the significance of the proposed method over the baselines, which I describe below.

Regarding the claim about the sharpness of the baseline LR schedules ("with fewer explore epochs, a large learning rate might still get lucky occasionally in finding a wide minimum, but invariably finds only a narrower minimum due to their higher density"): it would help to show curvature metrics at frequent intervals during training to confirm this hypothesis, and to also show these for the other learning rate schedules compared against, so that you can demonstrate that the proposed schedule achieves something the baselines cannot. The sharpness values in Figure 2 are interesting, but I am unable to determine how impressive they are given that they are not compared to sharpness values for any other schedules, so I don't know what the baseline numbers should be. Finally, it is unclear that the proposed method is novel enough to warrant a standalone paper without more rigorous theoretical explanations to support the claimed reasons behind its performance.

Pros: it is useful to note that definitions of curvature can be problematic, which the authors do discuss, citing https://arxiv.org/abs/1703.04933. The breadth of experiments is genuinely impressive, but it would be more impressive if the breadth were smaller and more careful tuning were done for the proposed method and baselines.

Concerns: in Appendix C, when describing your curvature metric, you say the maximization problem is solved by applying 1000 iterations of projected gradient ascent. How was 1000 chosen? Did the sharpness metric stop changing if more steps were used? What are the standard deviations of the results in Tables 6, 7, 18, 19, 20, 21, 26? The proposed results seem very close to the untuned baselines, and so it would be useful to understand how statistically significant they are. Toy problems can be extremely useful to empirically demonstrate this wide-vs-sharp minima selection phenomenon, such as in Wu et al. 2018 (https://papers.nips.cc/paper/8049-how-sgd-selects-the-global-minima-in-over-parameterized-learning-a-dynamical-stability-perspective) or the noisy quadratic model in https://arxiv.org/abs/1907.04164. The curves in Figure 7 seem extremely similar; it would help to plot the loss on a log scale. In Section 4.1 ("in all cases we use the model checkpoint with least loss on the validation set for computing BLEU scores on the test set"): is early stopping used in all experiments? If not, why? Regarding "thus, in the presence of several flatter minima, GD with lower learning rates does not find them, leading to the conjecture that the density of sharper minima is perhaps larger than the density of wider minima": it is unclear to me how their previous results support this hypothesis; couldn't one retune the learning rate of SGD to find sharper/flatter minima independently of how many sharp/flat minima exist?

Writing: the experiment details in the intro could be moved to later in the paper; they seem to be repeated in Section 2. Overall, the paper length seems like it could be drastically reduced by removing repeated statements. Figures 6, 7, 8, 9 would be much clearer to read with a single plot per row, possibly with a log scale on the vertical axis when applicable. For consistency, it would be useful to have the baseline short-budget results also reported in Table 5.

Prior work: there are many previous works on explaining the benefits of large learning rates, the most relevant being https://arxiv.org/abs/1907.04595, which seems to make the same case as this paper but is not cited. Additionally, https://arxiv.org/abs/2003.02218 has more theoretical explanations for this using the neural tangent kernel literature, and the authors could likely derive similar explanations; in fact, that work uses a similar schedule to the proposed method but does not give it a name ("the network is trained with different initial learning rates followed by a decay at a fixed physical time t to the same final learning rate; this schedule is introduced in order to ensure that all experiments have the same level of SGD noise toward the end of training"). Finally, there are other works that describe how low-curvature directions of the loss landscape will be learned first, benefiting from a higher LR, followed by high-curvature/high-noise directions, which benefit from a smaller LR, as described in https://arxiv.org/abs/1907.04164. I believe that a more formal explanation and analysis of the claims on solution curvature density should be provided.

Additional feedback, comments, suggestions for improvement, and questions for the authors: I believe that a fairer experimental setup would be similar to the following. Pick several competitive LR schedules for each problem, not just the standard ones. Identify a similar number of hyperparameters for each, such as number of warmup steps, decay values, decay curve shapes, etc. Retune each schedule and the proposed method for the same number of trials, using similarly sized search spaces for each; ideally one would also retune the initial/final learning rates, momentum, and other hyperparameters for each, but this may be too expensive. Select the best-performing hyperparameter setting for each schedule and rerun it over multiple seeds to check for stability. It can be problematic to make comparisons across methods with different numbers of hyperparameters, even with the same tuning budget, because it is impossible to construct same-volume hyperparameter spaces with different numbers of hyperparameters; see https://arxiv.org/abs/2007.01547 for a more thorough treatment.

### Summary:
The reviewers are concerned about the novelty of the proposed learning rate schedule, the rigor of the empirical validation, and the relationship between the results and the discussion of sharp vs. local minima. I invite the authors to incorporate the reviewers' comments and resubmit to other ML venues.
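The schedule these reviews discuss is a two-phase "knee": hold a high learning rate through an exploration phase, then decay it linearly to zero. A minimal sketch using PyTorch's LambdaLR is below; the phase lengths and the peak learning rate are placeholders, not the paper's tuned settings.

```python
# Minimal knee-style schedule: constant peak LR for explore_steps, then a
# linear decay to zero over the remaining steps.
from torch.optim.lr_scheduler import LambdaLR

def knee_lambda(explore_steps: int, total_steps: int):
    def factor(step: int) -> float:
        # Multiplicative factor applied to the optimizer's base (peak) LR.
        if step < explore_steps:
            return 1.0
        remaining = max(1, total_steps - explore_steps)
        return max(0.0, 1.0 - (step - explore_steps) / remaining)
    return factor

# Usage (optimizer assumed to exist); call scheduler.step() once per training step:
# scheduler = LambdaLR(optimizer, lr_lambda=knee_lambda(explore_steps=5_000, total_steps=20_000))
```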
input_ids: [ 604, 5890, 484, … ] (token-ID encoding of this row's text; it starts partway through the review above, consistent with truncation to the maximum sequence length, and the full list is omitted for readability)
attention_mask: [ 1, 1, 1, … ] (all ones; the array is cut off in the source, and the labels column for this row is not shown)
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 604, 5890, 484, 50276, 15083, 7843, 50276, 8632, 333, 310, 1691, 2366, 387, 253, 990, 352, 310, 247, 6970, 342, 247, 2176, 36199, 281, 247, 7349, 460, 50275, 29483, 595, 604, 253, 1491, 310, 2130, 352, 651, 320, 4217, 281, 452, 253, 2629, 21492, 273, 253, 3388, 2193, 5118, 275, 2829, 721, 50275, 7152, 339, 713, 4026, 2281, 10130, 7120, 271, 1774, 2554, 275, 45439, 534, 556, 247, 1781, 4833, 689, 253, 2457, 3045, 2167, 627, 452, 644, 8783, 273, 28631, 17170, 256, 5503, 3045, 1335, 4419, 10182, 1133, 85, 37437, 10130, 326, 778, 320, 1083, 407, 1083, 2429, 342, 2045, 4715, 2281, 28631, 4477, 806, 19704, 1520, 326, 253, 1180, 273, 4618, 46836, 310, 3012, 2406, 685, 253, 1180, 273, 9479, 46836, 285, 840, 50276, 856, 7334, 281, 897, 247, 1781, 4715, 2281, 387, 253, 31850, 3408, 323, 4209, 17947, 281, 5115, 247, 4618, 46836, 534, 778, 5115, 1805, 26647, 3045, 9470, 4679, 17813, 253, 4081, 4715, 2281, 10130, 50275, 783, 8310, 273, 436, 2929, 4453, 4722, 285, 4477, 452, 5196, 8783, 273, 4679, 281, 17813, 253, 2538, 273, 4081, 4715, 2281, 28631, 2299, 253, 38135, 273, 436, 2929, 3133, 3710, 806, 4477, 24366, 326, 253, 1180, 273, 4618, 46836, 310, 3012, 2406, 685, 253, 1180, 273, 9479, 46836, 533, 352, 19756, 247, 11080, 5839, 273, 436, 24366, 2057, 432, 2905, 16774, 1263, 390, 10527, 4685, 1273, 323, 253, 4081, 4715, 2281, 10130, 352, 3133, 417, 1077, 2590, 849, 281, 873, 253, 7467, 273, 17947, 44540, 20420, 2439, 1027, 8892, 347, 352, 310, 1335, 247, 1133, 85, 37437, 4373, 19484, 323, 4229, 2456, 8338, 627, 310, 417, 1199, 3064, 275, 2426, 273, 253, 3045, 2429, 342, 2045, 10130, 824, 347, 7349, 460, 10027, 390, 4872, 10027, 275, 2829, 721, 50272, 1189, 455, 891, 5257, 281, 247, 5075, 12009, 285, 352, 651, 1199, 1805, 604, 4477, 812, 564, 12861, 3212, 253, 8310, 585, 720, 1520, 406, 339, 793, 360, 3454, 436, 2929, 858, 271, 16774, 1263, 327, 253, 4715, 2281, 298, 83, 10130, 323, 3676, 11454, 6928, 277, 79, 2224, 3733, 253, 4477, 9059, 326, 253, 4038, 273, 4618, 46836, 310, 2406, 685, 9479, 46836, 285, 840, 921, 326, 436, 2789, 7562, 1029, 298, 83, 3309, 4720, 597, 12661, 247, 747, 298, 83, 10130, 326, 18922, 1029, 298, 83, 2217, 1048, 50275, 856, 84, 209, 186, 783, 1895, 436, 2929, 2175, 310, 1395, 323, 277, 79, 2224, 3733, 253, 4081, 298, 83, 10130, 310, 2969, 285, 556, 253, 2442, 281, 320, 908, 7561, 209, 186, 783, 4477, 2589, 9470, 16774, 5216, 281, 1329, 616, 1750, 285, 253, 5661, 2216, 310, 5272, 50276, 5040, 209, 186, 303, 417, 4751, 13762, 407, 253, 9079, 326, 4618, 46836, 452, 2406, 4038, 253, 16774, 1543, 476, 320, 5544, 407, 643, 24316, 347, 973, 323, 1650, 352, 310, 671, 1896, 326, 4618, 46836, 403, 21816, 1977, 432, 253, 31850, 891, 1158, 253, 4477, 878, 281, 2057, 2085, 10527, 1783, 390, 1705, 598, 342, 747, 4679, 281, 2007, 12654, 436, 9079, 209, 186, 783, 4081, 298, 83, 10130, 1057, 417, 1646, 3309, 581, 812, 4354, 5115, 253, 1072, 4096, 407, 5368, 298, 83, 28631, 24088, 897, 247, 3213, 10027, 298, 83, 10130, 50276, 186, 783, 38135, 310, 1698, 253, 2022, 38135, 273, 253, 2929, 310, 253, 1840, 9079, 533, 352, 310, 417, 4516, 2217, 253, 4081, 298, 83, 10130, 310, 247, 5777, 7321, 2715, 273, 253, 5368, 298, 83, 10130, 50276, 40622, 253, 7680, 273, 436, 2929, 3133, 32809, 50275, 7152, 33032, 39930, 4583, 891, 2868, 253, 14023, 281, 1666, 25379, 1646, 1512, 20276, 281, 2096, 253, 1318, 273, 253, 4081, 1332, 5001, 253, 3777, 3733, 7563, 1543, 12267, 10130, 476, 5115, 253, 1072, 7200, 347, 253, 8245, 342, 247, 1199, 3777, 3733, 7563, 497, 253, 8245, 28631, 671, 851, 37437, 323, 253, 3777, 
3733, 7563, 604, 417, 436, 3133, 751, 271, 16593, 5750, 281, 253, 4081, 1332, 323, 1650, 275, 253, 13361, 49181, 7324, 5987, 39962, 2061, 5375, 746, 2313, 33856, 323, 4440, 257, 292, 627, 452, 644, 28631, 11253, 273, 247, 4872, 5890, 484, 3560, 407, 21396, 10027, 326, 452, 644, 24251, 281, 3986, 818, 3046, 275, 760, 6705, 44540, 1014, 387, 7863, 14604, 9552, 27594, 326, 253, 8245, 10130, 275, 2829, 495, 812, 2779, 513, 1199, 1805, 685, 752, 310, 2361, 604, 352, 369, 851, 37437, 342, 253, 1072, 1180, 273, 7587, 347, 253, 4081, 1332, 390, 604, 247, 625, 12085, 8245, 10130, 369, 908, 690, 273, 253, 1543, 1646, 24363, 347, 973, 275, 253, 3733, 6970, 8442, 721, 818, 854, 898, 884, 275, 253, 30762, 352, 3133, 8909, 326, 253, 4081, 1332, 760, 32010, 598, 281, 253, 18093, 37437, 1666, 25379, 4404, 253, 1077, 990, 273, 3733, 285, 326, 436, 369, 417, 5393, 275, 253, 2022, 2505, 323, 1650, 327, 260, 338, 274, 740, 253, 8245, 27125, 253, 4081, 1332, 1919, 253, 2457, 608, 562, 273, 1052, 44540, 273, 3733, 327, 270, 797, 16374, 3215, 26208, 352, 310, 12744, 432, 253, 14777, 672, 253, 4081, 1332, 27125, 253, 8245, 347, 253, 9191, 403, 594, 2074, 327, 259, 6917, 1047, 19072, 253, 8245, 27125, 253, 4081, 1332, 1919, 253, 2457, 8255, 562, 273, 5571, 44540, 273, 3733, 327, 891, 88, 3433, 85, 1047, 372, 257, 253, 8245, 285, 4081, 1332, 2831, 1016, 643, 247, 1643, 2069, 253, 2457, 673, 1146, 387, 23657, 7609, 273, 2456, 327, 891, 88, 3433, 85, 1047, 372, 257, 342, 253, 1111, 2990, 253, 8245, 285, 4081, 1332, 2831, 1016, 643, 247, 1643, 2069, 253, 2457, 673, 1146, 387, 23657, 24792, 273, 9166, 1223, 352, 310, 417, 12078, 323, 247, 4081, 1332, 281, 19486, 640, 247, 8245, 4404, 253, 990, 273, 3733, 841, 1543, 5224, 326, 4931, 604, 253, 1666, 25379, 497, 851, 37437, 597, 812, 6558, 616, 1805, 3045, 323, 253, 1390, 1643, 44540, 273, 3733, 970, 253, 1072, 3302, 298, 83, 323, 253, 4081, 285, 8245, 3082, 310, 4217, 2299, 352, 310, 12497, 281, 7568, 326, 253, 4081, 1332, 812, 1335, 1347, 973, 762, 1027, 3302, 2515, 891, 452, 3081, 7350, 670, 253, 8453, 273, 253, 4081, 1332, 689, 253, 1666, 25379, 534, 891, 6266, 2708, 50276, 1747, 13218, 10941, 281, 253, 9479, 1255, 273, 253, 8245, 298, 83, 28631, 342, 11184, 8338, 44540, 247, 1781, 4715, 2281, 1537, 1335, 755, 13476, 13949, 275, 4560, 247, 4618, 46836, 533, 37504, 9010, 760, 247, 39937, 46836, 1955, 281, 616, 2169, 4038, 352, 651, 1361, 281, 921, 16841, 17082, 387, 10879, 11508, 1309, 3733, 281, 6583, 436, 9079, 285, 281, 671, 921, 841, 323, 253, 643, 4715, 2281, 28631, 2429, 281, 594, 326, 368, 476, 7568, 326, 253, 4081, 10130, 33526, 1633, 253, 1666, 25379, 2550, 253, 9479, 1255, 2193, 275, 4677, 374, 403, 4722, 533, 891, 717, 7591, 281, 3653, 849, 13943, 597, 403, 1677, 326, 597, 403, 417, 2429, 281, 9479, 1255, 2193, 323, 667, 643, 28631, 594, 891, 13414, 871, 752, 253, 8245, 3904, 943, 320, 50276, 71, 3341, 352, 310, 12744, 326, 253, 4081, 1332, 310, 4460, 2217, 281, 7501, 247, 40468, 2929, 1293, 625, 26565, 10527, 22909, 281, 1329, 253, 7558, 4606, 3212, 697, 3045, 50276, 856, 84, 352, 310, 4217, 281, 3877, 326, 14308, 273, 16841, 476, 320, 20276, 534, 253, 4477, 513, 2319, 19936, 5987, 39962, 2061, 5375, 15046, 1229, 2537, 1610, 253, 37535, 273, 4679, 310, 27364, 13943, 533, 19235, 651, 320, 625, 13943, 604, 253, 37535, 369, 4577, 285, 625, 10182, 25184, 369, 2218, 323, 253, 4081, 1332, 285, 1666, 25379, 50276, 585, 1209, 2224, 275, 30762, 260, 672, 12930, 634, 16841, 7982, 368, 1333, 253, 11903, 1320, 1895, 310, 14042, 407, 9433, 9098, 25142, 273, 16589, 
11786, 49104, 849, 369, 9098, 6777, 858, 253, 9479, 1255, 7982, 3523, 6890, 604, 625, 5018, 497, 908, 752, 403, 253, 6268, 3620, 84, 273, 253, 1543, 275, 7180, 721, 818, 1283, 655, 1384, 3127, 3436, 253, 4081, 1543, 1646, 1077, 2810, 281, 253, 18093, 37437, 1666, 25379, 285, 594, 352, 651, 320, 4217, 281, 2096, 849, 10126, 1534, 597, 403, 20953, 3237, 476, 320, 6685, 4217, 281, 45190, 7568, 436, 4618, 4632, 9479, 46836, 5438, 16958, 651, 320, 4217, 824, 347, 275, 259, 86, 1162, 355, 4765, 5987, 50004, 79, 2824, 550, 20790, 1438, 2537, 5430, 8433, 69, 7135, 296, 248, 14456, 1222, 303, 404, 1189, 19484, 1025, 28269, 324, 3190, 474, 296, 1430, 5726, 4911, 390, 253, 27620, 21396, 1566, 275, 5987, 39962, 2061, 5375, 16129, 26942, 18467, 253, 9191, 275, 4677, 818, 1646, 6685, 2074, 352, 651, 1361, 281, 7484, 253, 2957, 327, 247, 2412, 4311, 275, 2593, 7609, 275, 512, 2219, 359, 897, 253, 1566, 32552, 342, 1878, 2957, 327, 253, 12820, 873, 323, 12672, 7387, 86, 7363, 327, 253, 1071, 873, 310, 2393, 15910, 908, 275, 512, 4679, 604, 417, 2139, 3021, 275, 253, 3361, 273, 2067, 892, 2569, 7221, 284, 305, 69, 342, 2406, 4715, 4142, 1057, 417, 1089, 731, 4283, 281, 253, 24366, 326, 4038, 273, 17614, 468, 46836, 310, 4931, 4067, 685, 4038, 273, 14200, 46836, 352, 310, 12744, 281, 479, 849, 616, 2045, 1543, 1329, 436, 9079, 812, 2649, 581, 851, 2517, 253, 4715, 2281, 273, 256, 35333, 281, 1089, 17614, 468, 1258, 2569, 46836, 10939, 273, 849, 1142, 9479, 22829, 46836, 2226, 50276, 17695, 253, 3368, 4278, 275, 253, 26432, 812, 320, 4395, 281, 1996, 275, 253, 2929, 352, 3133, 281, 320, 6015, 275, 2593, 374, 4583, 253, 2929, 2978, 3133, 751, 352, 812, 320, 31063, 3777, 407, 11922, 6015, 7234, 8442, 721, 818, 854, 898, 651, 320, 1199, 30909, 281, 1239, 604, 352, 369, 247, 2014, 7484, 591, 4194, 6830, 327, 247, 2412, 4311, 327, 253, 9118, 7844, 672, 7763, 323, 15274, 352, 651, 320, 4217, 281, 452, 8245, 2159, 7563, 671, 320, 2361, 275, 2829, 608, 50276, 40844, 789, 627, 403, 1142, 2045, 2987, 327, 15571, 253, 5373, 273, 1781, 4715, 4142, 253, 954, 4623, 1146, 5987, 39962, 2061, 5375, 16129, 1967, 1857, 2222, 534, 3133, 281, 1056, 253, 1072, 1083, 347, 436, 2929, 533, 310, 417, 11106, 23000, 5987, 39962, 2061, 5375, 1518, 1229, 1423, 1093, 556, 625, 28055, 22909, 323, 436, 970, 253, 11454, 28196, 10295, 6239, 285, 253, 4477, 812, 2779, 15313, 2074, 22909, 275, 958, 597, 897, 247, 2074, 10130, 347, 253, 4081, 1332, 533, 513, 417, 1918, 352, 247, 1416, 253, 2990, 310, 10166, 342, 1027, 3302, 4715, 4142, 3560, 407, 247, 10027, 387, 247, 4229, 3520, 673, 246, 50275, 936, 253, 1072, 2457, 4715, 2281, 436, 10130, 310, 5611, 275, 1340, 281, 5416, 326, 512, 4679, 452, 253, 1072, 1268, 273, 256, 35333, 6046, 2584, 253, 990, 273, 3733, 4720, 627, 403, 643, 2987, 326, 6266, 849, 1698, 16841, 10746, 273, 253, 2957, 13016, 588, 320, 6311, 806, 2750, 2996, 432, 247, 2169, 298, 83, 3560, 407, 1029, 16841, 8656, 6046, 10746, 534, 5373, 432, 247, 4577, 298, 83, 2529, 275, 5987, 39962, 2061, 5375, 16129, 26942, 18467, 891, 2868, 326, 247, 625, 7473, 8813, 285, 1783, 273, 253, 3916, 327, 2900, 16841, 4038, 943, 320, 2530, 50276, 38092, 8680, 5701, 13991, 323, 7756, 285, 3533, 323, 253, 4477, 891, 2868, 326, 22870, 83, 5661, 9978, 651, 320, 2074, 281, 253, 1563, 2619, 2067, 12085, 298, 83, 28631, 323, 1016, 1895, 417, 816, 253, 2629, 4394, 4271, 247, 2074, 1180, 273, 4373, 3575, 1336, 323, 1016, 824, 347, 1180, 273, 5890, 484, 5018, 10027, 2193, 10027, 6970, 15029, 3966, 851, 2517, 1016, 10130, 285, 253, 4081, 1332, 323, 253, 
1072, 1180, 273, 7587, 970, 12014, 25180, 3186, 8470, 323, 1016, 34243, 581, 651, 671, 851, 2517, 253, 3302, 13017, 4715, 4142, 10254, 285, 643, 4373, 22041, 323, 1016, 533, 436, 778, 320, 1512, 8214, 3609, 253, 1682, 9591, 4373, 19484, 4758, 323, 1016, 10130, 285, 294, 6321, 352, 689, 2709, 12922, 281, 2451, 323, 7882, 50276, 262, 476, 320, 20276, 281, 1056, 14023, 2439, 3082, 342, 1027, 3904, 273, 4373, 22041, 1014, 342, 253, 1072, 25184, 7563, 984, 352, 310, 7479, 281, 3989, 253, 1072, 4644, 4373, 19484, 8470, 342, 1027, 3904, 273, 4373, 22041, 923, 5987, 39962, 2061, 5375, 8602, 10496, 2504, 323, 247, 625, 11080, 1971, 187, 187, 4118, 18435, 27, 783, 30628, 403, 7514, 670, 253, 38135, 273, 253, 4081, 4715, 2281, 10130, 253, 8132, 263, 273, 253, 16774, 12820, 285, 253, 2954, 875, 253, 1543, 285, 253, 5955, 273, 9479, 4632, 1980, 46836, 891, 19864, 253, 4477, 281, 19071, 30628, 5701, 285, 501, 538, 2225, 281, 643, 13361, 28966 ]
Below is a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: The authors of this paper propose to use a continuous-time normalizing flow and an E(3)-equivariant GNN as the differential function in the neural PDE for molecular generation. Experimental results on QM9 and ZINC show that the proposed model, ModFlow, obtained better validity and novelty in comparison with existing graph-flow-based models.

Strengths: 1. The consideration of using a continuous-time normalizing flow as the encoder makes sense. The authors find that their proposed model is equivalent to a variant of temporal graph network (TGN).

Weaknesses: 1. The evaluation metrics used in this paper are validity, uniqueness, novelty, and reconstruction, which are usually used as sanity checks. To further look into the consistency between the generated samples and the training samples, more informative metrics should be used; for example, the metrics in MOSES can be considered. 2. It seems that an intact test set was not used for result evaluation. 3. Furthermore, the computational comparison should not be limited to flow-based methods; other state-of-the-art deep-learning-based molecular generation approaches should be compared as well. 4. Minor issues: (1) "moroever" should be "moreover"; (2) check the capitalization in each reference item. Only technical limitations are mentioned, as future work, in one sentence in Section 5 (Conclusion).

The presented method combines continuous E(3)-equivariant flows and PDEs on graphs to generate molecular structures. Strengths: Figure 2 helped me get a high-level understanding of the model. Experiment 4.1 showcased the capabilities of the model with a simple example. Sections 4.2 and 4.3 are easy to follow and fairly clear. The results seem impressive: the model's scores on validity, uniqueness, novelty, and reconstruction seem impressive. Weaknesses: The paper is not exceptionally well written; I had a hard time following Sections 3.2 and 3.3, the main contributions of this paper. The reader is expected to be familiar with the EGNN model; if the reader doesn't happen to know the exact inputs and mode of operation of this model, the reader will get lost. The importance of this borrowed architecture in this approach warrants a short introduction. The authors adequately addressed the limitations of their work.

The authors propose a new generative model for molecules based on normalizing flows. The network is based on a set of ODEs, one per node, that are coupled together to form a PDE, which lets the model graph densities accurately. The model is also E(3)-equivariant, which is an important physical property for molecules. The paper presents experiments on the QM9 and ZINC250k datasets and evaluates results on validity, uniqueness, novelty, and reconstruction metrics; the ModFlow method performs well on these metrics. Strengths: As the ModFlow method is based on combining normalizing flows with neural ODEs, it can generate graphs in one shot and also provide density estimates, unlike GANs. The model is E(3)-equivariant, which is a desirable property for molecules and other atomic systems. Weaknesses: It is not entirely clear how useful the metrics used in the paper are; as such, they are weak metrics. For example, validity simply measures the fraction of molecules that do not violate the chemical valency rule; most chemical applications require generating based on more complex properties, such as generating non-toxic drug molecules, generating synthesizable materials, etc. Therefore it is unclear how useful this method is in practice. Please discuss limitations in terms of computational efficiency, classes of atomic systems that can be generated by this method, etc. For example, this method cannot directly be used for materials where periodic boundary conditions need to be respected. Also, potential negative societal impact, such as generating toxic materials, should be mentioned.

This work introduces ModFlow, a modular continuous normalizing flow model parametrized by an E(3)-equivariant GNN. To demonstrate the efficacy of ModFlow, the model is trained on QM9 and ZINC250k to the end of generating valid, unique, and novel samples. Strengths: 1. The writing is clear for the most part, and the model is well presented. 2. The validity, uniqueness, and novelty results for QM9 and ZINC250k look great; the generated samples also seem reasonable. Weaknesses: My main concerns with this work are related to its numerical experiments. 1. More metrics should be included for a comprehensive assessment of ModFlow's generation quality. This is because validity, uniqueness, and novelty scores may not correlate with how realistic the samples are. To this end, one may use the Frechet ChemNet distance (FCD), which measures the similarity between the set of generated molecules and the set of training molecules. One can easily compute the FCD, as well as a variety of other metrics, through MOSES (https://github.com/molecularsets/moses), and the inclusion of these additional metrics will make the paper much more convincing. 2. More experiments are needed to fully demonstrate the effectiveness of ModFlow. For example, property optimization is conducted in, e.g., JT-VAE and GraphAF to find the best generated molecules with certain chemical scores (e.g., penalized logP, QED); such optimization showcases that the model is able to learn a meaningful latent space useful for downstream tasks. Other possibilities are latent-space interpolation and property prediction. Edit: in light of the added experimental results during the rebuttal, I have now raised my score to 6, as I believe the paper makes a solid contribution in cheminformatics. While ModFlow can generate molecules with high validity by directly sampling from the latent space, there does not seem to be a straightforward way to incorporate valency checks into the sampling procedure of ModFlow, should one desire to do so. ### Summary:
This paper introduces ModFlow, an E(3)-equivariant normalizing flow for generating molecular conformations. A normalizing flow model for molecular systems could be extremely useful, since normalizing flows have desirable properties including tractable densities and good sampling behavior. The authors demonstrated the effectiveness of their method compared to the state of the art on two canonical datasets, QM9 and ZINC250k, although it might have been nice to see something harder. All four referees weakly thought the paper should be accepted. All of the referees liked the idea of using normalizing flows for this problem and thought the approach laid out by the authors seemed reasonable. Some of the referees thought the writing was good, while others thought it could be improved. The main stumbling block was the choice of metrics reported on the QM9 and ZINC datasets; however, during the rebuttal the authors provided additional metrics which seemed to corroborate the success of the approach. Given the global support by the referees and the improvement by the authors during the rebuttal, I think the paper ought to be accepted.
[input_ids, attention_mask, and labels token-ID arrays for this example omitted — they encode the same review/summary text shown above]
Below is a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: 1. Using the minimax framework to optimize the empirical sum of Radon–Nikodym derivatives on the expected rewards is somewhat novel. 2. The paper is generally well written and easy to understand. 3. This paper provides a general generalization bound and a specific learning algorithm when the policy is parameterized.

1. The method regards DM, IPS, and DR as a black box, but actually there could be a lot of connections between the general framework and the standard methods. For example, IPS also has a weight which is similar to the r variable; the framework completely ignores the details in these estimators and just wraps a minimax framework on top of them. I think this may not be the optimal way, and the paper also does not discuss this. 2. The paper claims that the parameter in the constraint is easy to pick, but the experiment section only tries a large range of the parameter and shows the performance is best when the parameter matches the simulation. In practice, how to use domain knowledge to pick the parameter should be elaborated, as it is an important factor.

1. The method is heavily dependent on the constraint set for the max player, which is constructed based on prior knowledge about the odds ratio. The constraint set, as mentioned by the authors, is very hard to specify, as in DRO methods. The paper claims that their constraint set is an intuitive one; it certainly is when we directly specify the ratio, but this is also very restrictive. It seems this framework can only work with a specific type of constraint set. 2. I do not quite understand why the parameter of the policy is minimizing the control variates of the expected reward; intuitively, the parameter of the optimal policy should be maximizing the reward.

This paper studies a new problem of generalizing the policy to a target population, which is considered in the previous literature. The paper develops a method to optimize the minimax value of a policy that achieves the best worst-case policy value on the target population, with a theoretical guarantee. The experiments show that the proposed framework improves the generalizability of policies. The proposed method needs to calibrate the parameters and P(S=1); however, calibrating these two parameters needs some additional information or domain knowledge, which makes the proposed method not practical in real-world applications. In the experiments part, the author should show that the data-driven way of calibrating is effective near the true value (see Q3 and Q4).

1. This paper proposed a novel framework to generalize off-policy learning under sample selection bias. 2. The authors derive a min-max optimization formulation to learn generalizable policies. 3. This paper provides theoretical guarantees for the generalization bound of the policy value. The experiments may not be adequate to strongly support this paper's claims, especially for the experiment with real clinical data. 1. The experiment has a very limited number of samples, which makes the conclusion less convincing. 2. Can the authors add policy regret improvements compared to baselines to demonstrate their better policy generalization? Can the author further reduce the variance for the target policy in Eq. 7 (e.g., A. Owen and Y. Zhou, "Safe and effective importance sampling," Journal of the American Statistical Association)? In the experiment section, is it possible for the authors to investigate the variance of the target policy value estimation?

1. The paper studies a novel problem in off-policy optimization with potential practical importance. 2. The overall methodology design is sound. 3. The paper is well written and easy to understand. My main suggestion is to mention and compare with other approaches to off-policy optimization in your introduction as well as the experiments. Specifically, the paper focuses on value-based policy search among a prespecified policy class, which will suffer the selection bias as the value is defined as an expectation over the covariate shift; however, e.g. the so-called Q-learning in DTR (estimating E[Y|X,T] first and then taking actions greedily) will not have such an issue. Except for the suggestion in Q4, I have several minor suggestions regarding the experiment section that I hope to see in the response. 1. Regarding the simulation setting: a discussion on how significant the selection bias is, and a study on how it affects the improvement. 2. Add a skyline that knows the selection bias; it provides a more detailed picture of the tradeoff from robustness. 3. Discussion and possibly more experiments on more complex policy classes (e.g., neural networks): is it possible to extend to those cases at all? Besides, the argument at the top of page 5, right column, is loose; a more rigorous argument or proof should be added (e.g., even in the appendix). Mention where this condition appears and what may happen when adding the estimation errors there. ### Summary:
Meta review: In this paper the authors consider the problem of learning decision rules optimal with respect to some target distribution, where samples are drawn from a training distribution that potentially differs from the target distribution. The difference between distributions is modeled using a selector variable S. Since only the training data is observed, corresponding to the selector variable assuming the value 1, the authors treat the probability weight that would allow adjustment of policy learning to the target distribution as a non-identified quantity. Thus, rather than computing this weight directly, the authors choose to learn the worst-case optimal policy in the minimax sense, using ideas from sensitivity analysis in causal inference, where the sensitivity parameter bounds the odds ratio quantifying the relationship of the numerator and denominator of the selector weight. Finally, the authors evaluate their methods on both synthetic and real data. The reviewer consensus that emerged after the author feedback phase and discussion was positive overall, although some clarifications regarding proofs were requested for the final version of the paper.
[input_ids, attention_mask, and labels token-ID arrays for this example omitted — they encode the same review/summary text shown above]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: The paper proposes a deep Bayesian model for heterogeneous multi-omics data integration. The Gromov-Wasserstein (FGW) regularization between latent representations of heterogeneous views is used to align nodes/features in every pair of views, and the experimental results demonstrate an improvement in inferring meaningful relations. The proposed model allows the integration of both structured and unstructured heterogeneous views, which often occurs in the context of omics data. In order to align nodes/features in every pair of views, the Gromov-Wasserstein (FGW) regularization has been used, and the experimental results show a significant improvement of the proposed model over the baselines in several downstream tasks. However, the paper can be improved in the following aspects. It is not clear what the key difference is between BayReL and the proposed model besides the FGW regularization term. Can we apply BayReL and FGW in a pipeline to learn heterogeneous views, and how would that compare to the proposed models, if possible? What is the square node (FGW) in the graphical model in Figure 1? It is difficult to follow the generative process of the proposed models. In short, the paper is trying to solve an important challenge of heterogeneous multi-omics data integration by proposing an advanced model compared with BayReL; however, the key differences and improvements compared with BayReL should be discussed in detail.
docsep The authors propose a hierarchical Bayesian generative model for multi-view learning, aimed at application to the integration of multiple modalities of omics data. The paper is dense and terse, making it difficult to follow in order to pick out the novelty of the proposed approach. The definition of the experiments and the data used in the experiments section is not self-sufficient without consulting the appendix: it is hard to know what the problems being solved are, or the preprocessing and representation of the data used as input to the method. Although the approach is aimed at multi-omic integration, there is no real connection to the domain of application. The evaluation is focused on the comparison to BayReL, a related approach mentioned throughout the paper but not described in the related work. Multi-omics integration is of interest in computational biomedicine; however, by focusing on BayReL only in the related work, the authors miss a large body of work, old and new, to name some: SNF, iCluster, LIGER, embedding propagation. They also miss out on a large amount of data being generated at an increasing pace within the field. Judging from the results, MoReL seems to offer some performance improvement over BayReL on structured problems. The authors demonstrate the ability of MoReL to learn from unpaired data and in a setting with missing data; it all comes at the cost of a 45-fold increase in computational time. While there is limited novelty and a demonstrated improvement in performance over a single related approach, from the application and result-interpretation point of view the advantages of using MoReL are not clear.
docsep The authors propose a Bayesian framework to learn relations among multi-omic datasets. The main unique advantage over existing methods is that the proposed method is able to learn without an a priori dependency structure, and it allows a certain degree of missingness and mismatching. Experiments on two biomedical multi-omics data sets partially demonstrate the effectiveness of the proposed method. Strengths: I generally find the paper quite well written. The paper may be interesting to a much broader community rather than just the biomedical use cases the experiments have shown, as the methodology is presented for any multi-omic data in a very general way. The overall methodology is similar to BayReL: both papers formulate the problem of multi-omics data integration by modeling it as multi-view link prediction. However, this paper does have several nontrivial improvements over BayReL: it can handle scenarios with no known structure information, and the samples can be unpaired across views. Weaknesses: my main concern is with the experimental part. The experiments are not extensive and cannot fully demonstrate the advantages of the proposed approach. First, the datasets are restrictive: microbiome-metabolite interactions in CF may not be the most commonly used benchmark among biomedical multi-omic datasets, and many other more usual datasets are not considered. Second, many of the advantages the proposed method claims cannot be fully demonstrated by the two datasets in the experiments, though the authors do manipulate the structure of the datasets manually to show that the method still works when some key assumptions of BayReL do not hold. It would be more telling if the authors could conduct a more comprehensive experimental study, including more common datasets and datasets that naturally come with missingness and pairing issues. Also, given the computational overhead compared to BayReL, knowing whether the method can handle large-scale data with heterogeneous dependency structures would be helpful. The proposed method is convincing with respect to an important problem, and the paper has nontrivial improvements over existing methods, though it lacks comprehensive experimental validation. It is overall a good complement to the existing literature.
docsep The authors propose a Bayesian model to infer relations across heterogeneous views of data, which can be of structured and unstructured type. On the positive side, the approach deals with problems that could not previously be easily tackled, such as (1) having access to a graph of dependencies between features for each view, (2) dealing with missing instances on some of the views, and (3) having pairing information across each view. On the negative side, a more thorough empirical investigation of how well the approach can deal with those cases is lacking. Specifically: (1) a study of the robustness when the graph of dependencies between features is noisy is missing; one should expect that some edges are wrong, and study the dependency of the performance w.r.t. the number of wrong edges. (2) The authors show a single example where 10% of the data is removed; however, a study that reports the performance w.r.t. the fraction of removed data is needed to substantiate the claim that the approach is indeed successful in dealing with missing data. Moreover, the fraction of missing instances should be different for different views. (3) The authors report a single example of lack of alignment (reversing the order for one view but not the other), while a more systematic analysis with several random permutations is what should be reported. The computational complexity is reported for a single dataset, while it should be possible to give a sense of the time dependency of the proposed approach w.r.t. the training set and test set size; does it scale linearly? Some comments would be appreciated to clarify how the model deals with the case of several structured or unstructured views: arguably, the graph of dependencies between features would form disconnected components when multiple structured views are considered, and hence the shortest-path distance cannot be computed in that case. With regards to the case with multiple views, it would be of interest to study the sample complexity as a function of the number of available views since, as the authors have stated in the introduction, leveraging the information from multiple sources should improve the performance on tasks with a limited number of training samples. The approach deals with problems that could not previously be easily tackled; however, a more thorough empirical investigation of how well the approach can deal with those cases is lacking. ### Summary:
A deep Bayesian generative model is presented for multi-omics integration, using fused Gromov-Wasserstein regularization between latent representations of the data views. The method removes several nontrivial and practically important restrictions from an earlier method, BayReL, enabling application in new setups while still performing well. Reviewers discussed the paper with the authors, resolving misunderstandings of the differences from earlier work (esp. BayReL). The authors reported more extensive experiments in the rebuttal, though not comparisons. The main remaining weakness is that the contributions are in a very narrow field, or at least applications have only been demonstrated in the narrow field of multi-omics data analysis, and even within that field only in a narrow subfield; in a machine learning venue that is restrictive. Another issue is computational efficiency. The final decision then depends on how much weight we place on the novel contributions vs. these weaknesses.
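The reviews in this row center on a fused Gromov-Wasserstein (FGW) regularizer that aligns nodes/features across heterogeneous views. As a purely illustrative aid, the sketch below shows how such an alignment could be computed on toy data with the POT optimal-transport library; the toy dimensions, the shared latent space, and the choice alpha = 0.5 are assumptions, the function names follow POT to the best of my knowledge and may differ across versions, and this is not the reviewed paper's implementation.

```python
# Illustrative sketch only (not the paper's code): aligning two views' latent
# representations with a fused Gromov-Wasserstein (FGW) coupling via POT.
import numpy as np
import ot  # pip install pot

rng = np.random.default_rng(0)
Z1 = rng.normal(size=(30, 16))      # latent features of view 1 (e.g., 30 microbes)
Z2 = rng.normal(size=(45, 16))      # latent features of view 2 (e.g., 45 metabolites)

C1 = ot.dist(Z1, Z1)                # intra-view structure of view 1
C2 = ot.dist(Z2, Z2)                # intra-view structure of view 2
M = ot.dist(Z1, Z2)                 # cross-view feature cost (assumes a shared latent space)
p, q = ot.unif(len(Z1)), ot.unif(len(Z2))

# alpha balances the structural (Gromov) term against the feature (Wasserstein) term.
T = ot.gromov.fused_gromov_wasserstein(
    M, C1, C2, p, q, loss_fun="square_loss", alpha=0.5)
fgw_cost = ot.gromov.fused_gromov_wasserstein2(
    M, C1, C2, p, q, loss_fun="square_loss", alpha=0.5)

# T (30 x 45) is a soft node/feature alignment between the two views;
# fgw_cost is the scalar that would enter a training objective as a regularizer.
print(T.shape, float(fgw_cost))
```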
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 4081, 247, 3676, 17699, 16561, 1566, 323, 22766, 4471, 19177, 941, 9554, 253, 305, 409, 729, 88, 30666, 6339, 269, 72, 88, 37820, 875, 21624, 14237, 273, 22766, 6849, 310, 908, 281, 8495, 7632, 28862, 275, 1046, 4667, 273, 6849, 253, 5661, 1543, 452, 5183, 7756, 275, 9441, 804, 14282, 2493, 253, 4081, 1566, 4483, 281, 19837, 273, 1097, 18872, 285, 440, 34218, 22766, 6849, 534, 2223, 6569, 275, 253, 3634, 273, 7005, 982, 941, 275, 1340, 281, 8495, 7632, 28862, 275, 1046, 4667, 273, 6849, 253, 305, 409, 729, 88, 30666, 6339, 269, 72, 88, 556, 644, 908, 253, 5661, 1543, 921, 247, 1534, 7756, 273, 253, 4081, 1566, 281, 253, 1666, 25379, 275, 2067, 15450, 8892, 2299, 253, 2929, 476, 320, 5520, 275, 253, 1563, 7794, 50274, 262, 310, 417, 2590, 752, 310, 253, 2234, 3064, 875, 17699, 1661, 285, 253, 4081, 16280, 253, 269, 72, 88, 37820, 1307, 50273, 5092, 359, 4647, 17699, 1661, 285, 269, 72, 88, 715, 247, 15722, 281, 3037, 22766, 6849, 285, 849, 310, 352, 2429, 281, 253, 4081, 3210, 604, 1896, 50274, 5371, 310, 253, 6278, 4666, 269, 72, 88, 275, 253, 29886, 1566, 275, 4677, 337, 352, 310, 2834, 281, 956, 253, 1006, 800, 1232, 273, 253, 4081, 3210, 50276, 249, 2159, 253, 2929, 310, 2820, 281, 8415, 271, 1774, 5691, 273, 22766, 4471, 19177, 941, 9554, 407, 36636, 271, 7269, 1566, 2429, 342, 17699, 1661, 2299, 752, 310, 253, 2234, 3064, 285, 11701, 2429, 342, 17699, 1661, 943, 320, 5469, 275, 2508, 5474, 339, 431, 248, 4477, 12661, 247, 24498, 17699, 16561, 1006, 800, 1566, 323, 1554, 400, 827, 4715, 11205, 387, 2898, 281, 9554, 273, 2709, 33433, 273, 7005, 982, 941, 50276, 783, 2929, 310, 14086, 285, 4109, 339, 2403, 352, 2834, 281, 956, 275, 1340, 281, 2619, 562, 253, 38135, 273, 253, 4081, 2746, 253, 5426, 273, 253, 4679, 285, 253, 941, 908, 275, 253, 4679, 2593, 310, 417, 1881, 31031, 1293, 25021, 253, 30762, 352, 310, 1892, 281, 871, 752, 403, 253, 3237, 1146, 14042, 253, 638, 21678, 285, 6779, 273, 253, 941, 281, 320, 908, 347, 3280, 281, 253, 1332, 3738, 253, 2746, 310, 11205, 387, 4471, 4986, 9554, 627, 310, 642, 1524, 4602, 281, 253, 5028, 273, 2898, 253, 7103, 310, 7106, 327, 253, 5301, 281, 17699, 1661, 247, 2905, 2746, 5393, 4768, 253, 2929, 533, 417, 2529, 275, 253, 2905, 789, 50276, 23939, 19177, 9554, 310, 273, 1600, 275, 15180, 33379, 35705, 2299, 13654, 327, 17699, 1661, 760, 275, 253, 2905, 789, 253, 4477, 2985, 247, 1781, 2133, 273, 789, 1711, 285, 747, 281, 1416, 690, 3802, 71, 17857, 77, 8976, 298, 8047, 21496, 18634, 597, 671, 2985, 562, 327, 247, 1781, 2408, 273, 941, 1146, 4561, 342, 3629, 13870, 1561, 253, 1673, 50276, 6881, 3390, 432, 253, 1543, 625, 77, 3133, 281, 3959, 690, 3045, 7756, 689, 17699, 1661, 327, 18872, 3237, 253, 4477, 7568, 253, 3745, 273, 625, 77, 281, 3037, 432, 47223, 941, 285, 275, 247, 4758, 342, 5816, 941, 352, 512, 3249, 387, 247, 2105, 273, 5329, 7975, 2572, 275, 15180, 673, 50275, 6050, 627, 310, 247, 3710, 38135, 285, 5183, 7756, 275, 3045, 689, 247, 2014, 2905, 2746, 432, 253, 2898, 285, 906, 7914, 1127, 273, 1859, 253, 11361, 273, 970, 625, 77, 403, 417, 2590, 5474, 339, 431, 248, 4477, 12661, 247, 17699, 16561, 7792, 281, 3037, 2493, 2190, 4471, 4986, 15302, 253, 2022, 4451, 5750, 689, 5368, 3082, 310, 253, 4081, 1332, 310, 2104, 281, 3037, 1293, 247, 30400, 18925, 2605, 285, 352, 4483, 247, 2176, 4248, 273, 5816, 1255, 285, 19412, 16464, 4679, 327, 767, 35156, 4471, 19177, 941, 5239, 10571, 
7568, 253, 12510, 273, 253, 4081, 1332, 20544, 50276, 74, 3839, 1089, 253, 2929, 3240, 973, 3542, 253, 2929, 778, 320, 4722, 281, 247, 1199, 16055, 3114, 2581, 685, 816, 253, 35156, 897, 2219, 253, 4679, 452, 2011, 347, 253, 16182, 310, 3559, 323, 667, 4471, 4986, 941, 275, 247, 1077, 2087, 1039, 50275, 783, 4583, 16182, 310, 2074, 281, 17699, 1661, 1097, 9380, 36803, 253, 1895, 273, 4471, 19177, 941, 9554, 407, 14053, 352, 347, 1554, 400, 827, 3048, 10554, 2299, 436, 2929, 1057, 452, 2067, 37825, 11701, 689, 17699, 1661, 352, 812, 6016, 15216, 326, 452, 642, 1929, 2605, 1491, 285, 253, 3530, 812, 320, 47223, 2439, 6849, 50275, 20881, 1255, 265, 619, 2022, 4468, 310, 342, 253, 5661, 629, 50276, 783, 4679, 403, 417, 9470, 285, 2550, 4751, 7568, 253, 11361, 273, 253, 4081, 2746, 806, 253, 15302, 403, 29190, 18124, 297, 358, 292, 22386, 614, 6355, 275, 21194, 778, 417, 320, 253, 954, 7744, 908, 22791, 15302, 275, 253, 35156, 4471, 4986, 15302, 342, 1142, 643, 625, 7312, 15302, 417, 2783, 1273, 1142, 273, 253, 11361, 253, 4081, 1332, 3916, 2550, 320, 4751, 5183, 407, 253, 767, 15302, 275, 3368, 2167, 253, 4477, 513, 26526, 253, 2605, 273, 15302, 13542, 281, 921, 326, 253, 1332, 1335, 2987, 672, 690, 2234, 13260, 273, 17699, 1661, 513, 417, 2186, 352, 943, 320, 625, 7746, 604, 253, 4477, 812, 2589, 247, 625, 11088, 5661, 2175, 1690, 625, 1846, 15302, 285, 15302, 326, 403, 10748, 342, 5816, 1255, 285, 25015, 3374, 671, 347, 253, 15180, 18332, 2429, 281, 17699, 1661, 8958, 1880, 253, 3082, 812, 6016, 1236, 2510, 25912, 941, 342, 22766, 18925, 5289, 651, 320, 9371, 50276, 783, 4081, 1332, 310, 21414, 342, 1675, 281, 271, 1774, 1895, 253, 2929, 556, 37825, 7756, 689, 5368, 3082, 2167, 352, 19756, 11088, 5661, 12820, 352, 310, 4583, 247, 1175, 13503, 281, 5368, 6239, 5474, 339, 431, 248, 4477, 12661, 247, 17699, 16561, 1566, 281, 9441, 2493, 2439, 22766, 6849, 273, 941, 534, 476, 320, 273, 18872, 285, 440, 34218, 1511, 50276, 251, 253, 2762, 1930, 253, 2746, 13330, 342, 3237, 326, 812, 417, 320, 3786, 4354, 11463, 1070, 824, 347, 337, 1907, 2289, 281, 247, 4216, 273, 21011, 875, 3386, 323, 1016, 1859, 374, 10620, 342, 5816, 10872, 327, 690, 273, 253, 6849, 495, 1907, 25015, 1491, 2439, 1016, 1859, 50276, 251, 253, 4016, 1930, 247, 625, 11080, 16774, 5839, 273, 849, 973, 253, 2746, 476, 2968, 342, 1110, 2219, 310, 14999, 5742, 337, 247, 1263, 273, 253, 31640, 672, 253, 4216, 273, 21011, 875, 3386, 310, 27620, 310, 5816, 581, 943, 1902, 326, 690, 9297, 403, 3430, 285, 1263, 253, 18925, 273, 253, 3045, 8772, 253, 1180, 273, 3430, 9297, 374, 253, 4477, 921, 247, 2014, 1650, 835, 884, 273, 941, 310, 5176, 2299, 247, 1263, 326, 5012, 253, 3045, 8772, 253, 6919, 273, 5176, 941, 310, 3058, 281, 4326, 4513, 253, 1750, 326, 253, 2746, 310, 6296, 5547, 275, 10620, 342, 5816, 941, 25761, 253, 6919, 273, 5816, 10872, 943, 320, 1027, 323, 1027, 6849, 495, 253, 4477, 1304, 247, 2014, 1650, 273, 3480, 273, 12420, 40310, 253, 1340, 323, 581, 1859, 533, 417, 253, 643, 1223, 247, 625, 12082, 1783, 342, 2067, 3632, 39908, 310, 752, 943, 320, 2361, 253, 15180, 10454, 310, 2361, 323, 247, 2014, 10895, 1223, 352, 943, 320, 1896, 281, 1918, 247, 3282, 273, 253, 673, 18925, 273, 253, 4081, 2746, 50276, 88, 1378, 253, 3733, 873, 285, 1071, 873, 1979, 1057, 352, 4311, 23352, 690, 5701, 651, 320, 14109, 281, 19148, 849, 253, 1566, 13330, 342, 253, 1083, 273, 2067, 18872, 390, 440, 34218, 6849, 25711, 253, 50276, 10580, 273, 21011, 875, 3386, 651, 830, 33817, 4295, 672, 2709, 18872, 6849, 403, 2783, 285, 7613, 253, 30505, 1854, 
4181, 2550, 320, 10302, 275, 326, 1083, 50276, 3113, 17730, 281, 253, 1083, 342, 2709, 6849, 352, 651, 320, 273, 1600, 281, 1263, 253, 3410, 10454, 347, 247, 1159, 273, 253, 1180, 273, 2130, 6849, 1580, 347, 253, 4477, 452, 4767, 275, 253, 10199, 19732, 2977, 253, 1491, 432, 2709, 4973, 943, 3157, 253, 3045, 327, 8892, 342, 247, 3710, 1180, 273, 3733, 3530, 50274, 783, 2746, 13330, 342, 3237, 326, 812, 417, 320, 3786, 4354, 11463, 1070, 2299, 50276, 66, 625, 11080, 16774, 5839, 273, 849, 973, 253, 2746, 476, 2968, 342, 1110, 2219, 310, 14999, 50276, 187, 187, 4118, 18435, 27, 66, 3676, 17699, 16561, 1006, 800, 1566, 310, 3559, 323, 4471, 19177, 9554, 970, 29843, 305, 409, 729, 88, 30666, 6339, 37820, 875, 21624, 14237, 273, 253, 941, 6849, 253, 1332, 26586, 2067, 37825, 285, 18236, 1774, 13133, 432, 271, 4321, 1332, 17699, 1661, 17690, 2898, 275, 747, 873, 8777, 1223, 1335, 9591, 973, 50276, 15337, 398, 5469, 253, 2929, 342, 253, 4477, 30426, 23452, 1676, 723, 273, 253, 3910, 432, 4321, 789, 17985, 17699, 1661, 253, 4477, 2361, 625, 9470, 4679, 275, 253, 30080, 22559, 2167, 417, 14023, 253, 2022, 5780, 14855, 310, 326, 253, 9021, 403, 275, 247, 1077, 6891, 1673, 390, 387, 1878, 247, 35663, 452, 760, 644, 5183, 275, 253, 6891, 1673, 273, 4471, 19177, 941, 1783, 285, 1014, 1561, 326, 1673, 760, 275, 247, 6891, 749, 3423, 275, 247, 5145, 4715, 18767, 326, 310, 29190, 1529, 2523, 310, 15180, 6733, 253, 2457, 3061, 840, 7024, 327, 849, 1199, 2801, 359, 1659, 327, 253, 4460, 9021, 4632, 841, 32213 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 4081, 247, 3676, 17699, 16561, 1566, 323, 22766, 4471, 19177, 941, 9554, 253, 305, 409, 729, 88, 30666, 6339, 269, 72, 88, 37820, 875, 21624, 14237, 273, 22766, 6849, 310, 908, 281, 8495, 7632, 28862, 275, 1046, 4667, 273, 6849, 253, 5661, 1543, 452, 5183, 7756, 275, 9441, 804, 14282, 2493, 253, 4081, 1566, 4483, 281, 19837, 273, 1097, 18872, 285, 440, 34218, 22766, 6849, 534, 2223, 6569, 275, 253, 3634, 273, 7005, 982, 941, 275, 1340, 281, 8495, 7632, 28862, 275, 1046, 4667, 273, 6849, 253, 305, 409, 729, 88, 30666, 6339, 269, 72, 88, 556, 644, 908, 253, 5661, 1543, 921, 247, 1534, 7756, 273, 253, 4081, 1566, 281, 253, 1666, 25379, 275, 2067, 15450, 8892, 2299, 253, 2929, 476, 320, 5520, 275, 253, 1563, 7794, 50274, 262, 310, 417, 2590, 752, 310, 253, 2234, 3064, 875, 17699, 1661, 285, 253, 4081, 16280, 253, 269, 72, 88, 37820, 1307, 50273, 5092, 359, 4647, 17699, 1661, 285, 269, 72, 88, 715, 247, 15722, 281, 3037, 22766, 6849, 285, 849, 310, 352, 2429, 281, 253, 4081, 3210, 604, 1896, 50274, 5371, 310, 253, 6278, 4666, 269, 72, 88, 275, 253, 29886, 1566, 275, 4677, 337, 352, 310, 2834, 281, 956, 253, 1006, 800, 1232, 273, 253, 4081, 3210, 50276, 249, 2159, 253, 2929, 310, 2820, 281, 8415, 271, 1774, 5691, 273, 22766, 4471, 19177, 941, 9554, 407, 36636, 271, 7269, 1566, 2429, 342, 17699, 1661, 2299, 752, 310, 253, 2234, 3064, 285, 11701, 2429, 342, 17699, 1661, 943, 320, 5469, 275, 2508, 5474, 339, 431, 248, 4477, 12661, 247, 24498, 17699, 16561, 1006, 800, 1566, 323, 1554, 400, 827, 4715, 11205, 387, 2898, 281, 9554, 273, 2709, 33433, 273, 7005, 982, 941, 50276, 783, 2929, 310, 14086, 285, 4109, 339, 2403, 352, 2834, 281, 956, 275, 1340, 281, 2619, 562, 253, 38135, 273, 253, 4081, 2746, 253, 5426, 273, 253, 4679, 285, 253, 941, 908, 275, 253, 4679, 2593, 310, 417, 1881, 31031, 1293, 25021, 253, 30762, 352, 310, 1892, 281, 871, 752, 403, 253, 3237, 1146, 14042, 253, 638, 21678, 285, 6779, 273, 253, 941, 281, 320, 908, 347, 3280, 281, 253, 1332, 3738, 253, 2746, 310, 11205, 387, 4471, 4986, 9554, 627, 310, 642, 1524, 4602, 281, 253, 5028, 273, 2898, 253, 7103, 310, 7106, 327, 253, 5301, 281, 17699, 1661, 247, 2905, 2746, 5393, 4768, 253, 2929, 533, 417, 2529, 275, 253, 2905, 789, 50276, 23939, 19177, 9554, 310, 273, 1600, 275, 15180, 33379, 35705, 2299, 13654, 327, 17699, 1661, 760, 275, 253, 2905, 789, 253, 4477, 2985, 247, 1781, 2133, 273, 789, 1711, 285, 747, 281, 1416, 690, 3802, 71, 17857, 77, 8976, 298, 8047, 21496, 18634, 597, 671, 2985, 562, 327, 247, 1781, 2408, 273, 941, 1146, 4561, 342, 3629, 13870, 1561, 253, 1673, 50276, 6881, 3390, 432, 253, 1543, 625, 77, 3133, 281, 3959, 690, 3045, 7756, 689, 17699, 1661, 327, 18872, 3237, 253, 4477, 7568, 253, 3745, 273, 625, 77, 281, 3037, 432, 47223, 941, 285, 275, 247, 4758, 342, 5816, 941, 352, 512, 3249, 387, 247, 2105, 273, 5329, 7975, 2572, 275, 15180, 673, 50275, 6050, 627, 310, 247, 3710, 38135, 285, 5183, 7756, 275, 3045, 689, 247, 2014, 2905, 2746, 432, 253, 2898, 285, 906, 7914, 1127, 273, 1859, 253, 11361, 273, 970, 625, 77, 403, 417, 2590, 5474, 339, 431, 248, 4477, 12661, 247, 17699, 16561, 7792, 281, 3037, 2493, 2190, 4471, 4986, 15302, 253, 2022, 4451, 5750, 689, 5368, 3082, 310, 253, 4081, 1332, 310, 2104, 281, 3037, 1293, 247, 30400, 18925, 2605, 285, 352, 4483, 247, 2176, 4248, 273, 5816, 1255, 285, 19412, 16464, 4679, 327, 767, 35156, 4471, 19177, 941, 5239, 10571, 
7568, 253, 12510, 273, 253, 4081, 1332, 20544, 50276, 74, 3839, 1089, 253, 2929, 3240, 973, 3542, 253, 2929, 778, 320, 4722, 281, 247, 1199, 16055, 3114, 2581, 685, 816, 253, 35156, 897, 2219, 253, 4679, 452, 2011, 347, 253, 16182, 310, 3559, 323, 667, 4471, 4986, 941, 275, 247, 1077, 2087, 1039, 50275, 783, 4583, 16182, 310, 2074, 281, 17699, 1661, 1097, 9380, 36803, 253, 1895, 273, 4471, 19177, 941, 9554, 407, 14053, 352, 347, 1554, 400, 827, 3048, 10554, 2299, 436, 2929, 1057, 452, 2067, 37825, 11701, 689, 17699, 1661, 352, 812, 6016, 15216, 326, 452, 642, 1929, 2605, 1491, 285, 253, 3530, 812, 320, 47223, 2439, 6849, 50275, 20881, 1255, 265, 619, 2022, 4468, 310, 342, 253, 5661, 629, 50276, 783, 4679, 403, 417, 9470, 285, 2550, 4751, 7568, 253, 11361, 273, 253, 4081, 2746, 806, 253, 15302, 403, 29190, 18124, 297, 358, 292, 22386, 614, 6355, 275, 21194, 778, 417, 320, 253, 954, 7744, 908, 22791, 15302, 275, 253, 35156, 4471, 4986, 15302, 342, 1142, 643, 625, 7312, 15302, 417, 2783, 1273, 1142, 273, 253, 11361, 253, 4081, 1332, 3916, 2550, 320, 4751, 5183, 407, 253, 767, 15302, 275, 3368, 2167, 253, 4477, 513, 26526, 253, 2605, 273, 15302, 13542, 281, 921, 326, 253, 1332, 1335, 2987, 672, 690, 2234, 13260, 273, 17699, 1661, 513, 417, 2186, 352, 943, 320, 625, 7746, 604, 253, 4477, 812, 2589, 247, 625, 11088, 5661, 2175, 1690, 625, 1846, 15302, 285, 15302, 326, 403, 10748, 342, 5816, 1255, 285, 25015, 3374, 671, 347, 253, 15180, 18332, 2429, 281, 17699, 1661, 8958, 1880, 253, 3082, 812, 6016, 1236, 2510, 25912, 941, 342, 22766, 18925, 5289, 651, 320, 9371, 50276, 783, 4081, 1332, 310, 21414, 342, 1675, 281, 271, 1774, 1895, 253, 2929, 556, 37825, 7756, 689, 5368, 3082, 2167, 352, 19756, 11088, 5661, 12820, 352, 310, 4583, 247, 1175, 13503, 281, 5368, 6239, 5474, 339, 431, 248, 4477, 12661, 247, 17699, 16561, 1566, 281, 9441, 2493, 2439, 22766, 6849, 273, 941, 534, 476, 320, 273, 18872, 285, 440, 34218, 1511, 50276, 251, 253, 2762, 1930, 253, 2746, 13330, 342, 3237, 326, 812, 417, 320, 3786, 4354, 11463, 1070, 824, 347, 337, 1907, 2289, 281, 247, 4216, 273, 21011, 875, 3386, 323, 1016, 1859, 374, 10620, 342, 5816, 10872, 327, 690, 273, 253, 6849, 495, 1907, 25015, 1491, 2439, 1016, 1859, 50276, 251, 253, 4016, 1930, 247, 625, 11080, 16774, 5839, 273, 849, 973, 253, 2746, 476, 2968, 342, 1110, 2219, 310, 14999, 5742, 337, 247, 1263, 273, 253, 31640, 672, 253, 4216, 273, 21011, 875, 3386, 310, 27620, 310, 5816, 581, 943, 1902, 326, 690, 9297, 403, 3430, 285, 1263, 253, 18925, 273, 253, 3045, 8772, 253, 1180, 273, 3430, 9297, 374, 253, 4477, 921, 247, 2014, 1650, 835, 884, 273, 941, 310, 5176, 2299, 247, 1263, 326, 5012, 253, 3045, 8772, 253, 6919, 273, 5176, 941, 310, 3058, 281, 4326, 4513, 253, 1750, 326, 253, 2746, 310, 6296, 5547, 275, 10620, 342, 5816, 941, 25761, 253, 6919, 273, 5816, 10872, 943, 320, 1027, 323, 1027, 6849, 495, 253, 4477, 1304, 247, 2014, 1650, 273, 3480, 273, 12420, 40310, 253, 1340, 323, 581, 1859, 533, 417, 253, 643, 1223, 247, 625, 12082, 1783, 342, 2067, 3632, 39908, 310, 752, 943, 320, 2361, 253, 15180, 10454, 310, 2361, 323, 247, 2014, 10895, 1223, 352, 943, 320, 1896, 281, 1918, 247, 3282, 273, 253, 673, 18925, 273, 253, 4081, 2746, 50276, 88, 1378, 253, 3733, 873, 285, 1071, 873, 1979, 1057, 352, 4311, 23352, 690, 5701, 651, 320, 14109, 281, 19148, 849, 253, 1566, 13330, 342, 253, 1083, 273, 2067, 18872, 390, 440, 34218, 6849, 25711, 253, 50276, 10580, 273, 21011, 875, 3386, 651, 830, 33817, 4295, 672, 2709, 18872, 6849, 403, 2783, 285, 7613, 253, 30505, 1854, 
4181, 2550, 320, 10302, 275, 326, 1083, 50276, 3113, 17730, 281, 253, 1083, 342, 2709, 6849, 352, 651, 320, 273, 1600, 281, 1263, 253, 3410, 10454, 347, 247, 1159, 273, 253, 1180, 273, 2130, 6849, 1580, 347, 253, 4477, 452, 4767, 275, 253, 10199, 19732, 2977, 253, 1491, 432, 2709, 4973, 943, 3157, 253, 3045, 327, 8892, 342, 247, 3710, 1180, 273, 3733, 3530, 50274, 783, 2746, 13330, 342, 3237, 326, 812, 417, 320, 3786, 4354, 11463, 1070, 2299, 50276, 66, 625, 11080, 16774, 5839, 273, 849, 973, 253, 2746, 476, 2968, 342, 1110, 2219, 310, 14999, 50276, 187, 187, 4118, 18435, 27, 66, 3676, 17699, 16561, 1006, 800, 1566, 310, 3559, 323, 4471, 19177, 9554, 970, 29843, 305, 409, 729, 88, 30666, 6339, 37820, 875, 21624, 14237, 273, 253, 941, 6849, 253, 1332, 26586, 2067, 37825, 285, 18236, 1774, 13133, 432, 271, 4321, 1332, 17699, 1661, 17690, 2898, 275, 747, 873, 8777, 1223, 1335, 9591, 973, 50276, 15337, 398, 5469, 253, 2929, 342, 253, 4477, 30426, 23452, 1676, 723, 273, 253, 3910, 432, 4321, 789, 17985, 17699, 1661, 253, 4477, 2361, 625, 9470, 4679, 275, 253, 30080, 22559, 2167, 417, 14023, 253, 2022, 5780, 14855, 310, 326, 253, 9021, 403, 275, 247, 1077, 6891, 1673, 390, 387, 1878, 247, 35663, 452, 760, 644, 5183, 275, 253, 6891, 1673, 273, 4471, 19177, 941, 1783, 285, 1014, 1561, 326, 1673, 760, 275, 247, 6891, 749, 3423, 275, 247, 5145, 4715, 18767, 326, 310, 29190, 1529, 2523, 310, 15180, 6733, 253, 2457, 3061, 840, 7024, 327, 849, 1199, 2801, 359, 1659, 327, 253, 4460, 9021, 4632, 841, 32213 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: Summary: the authors present a general formulation of different settings in multitask learning, including pretraining regimes, in a setting where the goal is to get the best performance for a prespecified primary task given additional auxiliary tasks. The main idea is to divide the gradients on the auxiliary task into two subspaces: a subspace where the gradients influence the performance of the primary task, and a subspace where they only influence the auxiliary task without changing the loss on the primary task. Within the subspace that does have influence on the primary task, it is easy to compute directions that have a positive or negative effect on the primary task, which allows creating different learning schemes given the gradients that point toward (i) auxiliary influence only, (ii) positive influence on the auxiliary task, (iii) negative influence on the primary task. Experimental results show improvements over previous meta-learning methods on 2 natural language datasets and 3 image datasets. Strengths: the authors present a general framework for an important problem; it has many applications in a wide variety of fields and contributes to thinking about meta-learning and pretraining in a more general way. Explanations, illustrations and mathematical derivations are clear and easy to understand. The authors show that with a careful choice of hyperparameters their approach can improve performance, especially in settings with limited data on the primary task. The results on natural language datasets are also interesting, showing that they achieve a higher performance when the auxiliary task doesn't exactly match the primary task data. Weaknesses: some of the explanations, and especially the proof, fall apart when considering the k largest principal vectors of J; since k < d, the sum of all gradients in g_aux will clearly still have a big influence on the performance of the primary task. The proposed algorithm introduces a lot of additional hyperparameters, and not all of those hyperparameters are properly discussed: eta_aux and eta_prim are properly discussed, and the authors convincingly show that these parameters are implicitly set by other methods as well, but figure 2 shows an ablation study that does not help practitioners to set the discussed values, given the large variance over 5 runs. The choice of the subspace for g_aux seems very critical, and the provided experimental results and discussion are somewhat lacking: how can a random choice of subspace basis improve the results? Calculating the randomized low-rank approximation is only done every n steps, but there is no mention of n later; some experimental results should be provided to convince the reader that the basis does not change too much given 2 different batches from the primary task. Other remarks: the result in table 2 is much better than the best result in table 4 on the cats-vs-dogs experiment; what is the difference? Can the experiments in table 4 be repeated to be more comparable to table 2? PCGrad most closely resembles this work; what type of subspace basis is used in that work? It would be interesting to see a direct comparison between PCGrad and the proposed method with eta_aux = (alpha_aux, alpha_aux, alpha_aux) and using the same basis. Early stopping after 10 epochs seems quite short and might explain some of the large variance in the results. Minor remarks: k is introduced without much explanation, which was a bit confusing on first reading; it should be clearly stated that it is a hyperparameter on first mention.
docsep Summary: leveraging the power of data-rich related tasks has been studied, e.g. pretraining and multitask learning. This paper points out that careful utilization of auxiliary tasks is required to gain enhanced performance on primary tasks. In order to prevent harming the performance of primary tasks, they suggest a method to decompose auxiliary updates into three directions, which have positive, negative and neutral impact on the primary task. Reasons for score: in this paper it is highly interesting to see how to use a decomposition from the span of the primary task Jacobian to adapt auxiliary gradients, and to validate the proposed methodology on image and textual data. Even though this is an interesting setting and the technical solutions presented in the paper look reasonable, the idea seems to be pretty incremental, as it stacks multiple existing techniques without many innovations. Pros: 1. the proposed methodology utilizes automatic differentiation procedures and randomized singular value decomposition for efficient scalability; 2. the proposed framework allows the model to treat each auxiliary update independently by its impact on the task of interest, which seems to be interesting. Cons: the authors need to perform more qualitative and quantitative analysis on the datasets to verify the effectiveness of the proposed methodology.
docsep The work studies auxiliary task selection in deep learning, to resolve the burden of selecting relevant tasks for pretraining or multitask learning. By decomposing the auxiliary updates, one can reweight separately the beneficial and harmful directions so that the net contribution to the update of the primary task is always positive. The efficient implementation is experimented with in text classification, image classification and medical imaging transfer tasks. The first contribution is the decomposition algorithm and reweighting of the auxiliary updates. It is a simple idea with a nice insight of treating the primary task and the auxiliary tasks in different manners: the decomposition allows a reweighting of the updates to optimize the primary task as much as possible while keeping the auxiliary tasks providing improvable directions. The second contribution is an efficient mechanism to approximate and calculate the SVD of the Jacobian of the primary task; the mechanism is implemented from an existing randomized approximation method. The third contribution is a set of experiments verifying the proposed method; the experiments include text classification, image classification and medical imaging transfer tasks. The most salient result is the 99% data efficiency to achieve improved performance in the medical imaging transfer task. Concerns: besides the above positive contributions, the following are some concerns from the observations. 1. The relative improvements compared to the baselines in table 1 and table 2 do not seem as large as those in Gururangan et al. 2020 and Yu et al. 2020, respectively. 2. The weights reported in the experiments are 1 or -1; for instance, eta_aux = (1, 1, -1) is reported in the image classification task. The reader would expect much better improvements when given the freedom to reassign the weights on the decomposed directions, especially when the harmful part has a negated weight. Moreover, why are the values chosen in eta 1 or -1? Would there be a nicer balance between, say, the beneficial and the harmful parts; for instance, would eta = (1, 0.8, -0.9) be a better choice? It would be crucial that the authors explain further, or support with further experiments, whether the potential of this decomposition algorithm is fully demonstrated or not. Post-rebuttal: I have read the authors' response, and all my concerns are addressed properly. However, I still doubt whether even the corner cases of eta give a better performance; would there be a systematic way to find the optimal parameters reflecting the true potential of this method? Thus I will keep my score unchanged.
docsep Summary: this paper studies auxiliary learning and proposes a method to decompose the auxiliary gradient into several components and to select relevant/useful decomposed gradients to maximize the assistance to the primary task. Strength: how to adjust auxiliary tasks in a beneficial way is always a challenging problem in multitask learning; the proposed solution based on gradient decomposition is novel and interesting, and the improved performance based on the proposed method seems to be nontrivial. Weakness: this paper, however, has some significant weaknesses, which I will outline below. Missing details: the subspace of the primary task gradient is composed of all training samples in the primary task, based on the definition of \mathcal{S}; however, for any reasonable-size training dataset, which contains at least 10k training samples, this process seems to be extremely expensive to compute. The authors have introduced a randomized approximation for the decomposition step, but accumulating the gradients already seems to take a lot of computation; I hope the authors can justify this. Unfair experimental setting: my biggest concern is hyperparameter tuning on each component of the decomposed auxiliary task gradient. In eq. 1 the authors decompose each auxiliary task gradient into three components: (i) one lying in the subspace of the primary task gradient, (ii) one orthogonal to the primary task gradient, and (iii) the final one in conflict with the primary task gradient. By selecting different weightings on each component, the authors claim that this formulation can be reduced to prior auxiliary learning methods. I agree that this formulation is general; however, all of these component weightings are selected by hand and seem to be very different across different datasets and tasks. This gives a very unfair comparison to previous methods, simply because these methods are included in one of the hyperparameter sets of corresponding weighting values. I have noticed that the authors have constrained the search space to 4 sets of weightings, but this does not resolve the problem; ideally these weightings should be computed automatically during training and varied based on the dataset. In addition, the weighting for the primary task also varies across different datasets, so I am wondering whether these task weightings are consistent in the baselines, and did the authors perform a similar hyperparameter search on the baseline methods as well? Other minor issues: this formulation cannot be used for pretraining with only the auxiliary gradients that are helpful to the primary task, since we do not know the primary task gradient beforehand; even if we know the primary task gradient, setting the weights to (0, 1, 0) simply puts more weight on the existing primary task gradient, so it is the same as tuning up the learning rate. Rating raised after the authors' clarification. ### Summary:
After engaging in some good interactive discussions, all but one reviewer settled on a rating of marginal accept. The most negative reviewer didn't really provide a clear enough explanation of what was lacking in the work; the other reviewers felt that the observed gains for this multitask learning framework were clear enough that the work is worthy of some attention by the community. The AC recommends acceptance, but one may consider this recommendation as a "just past the line for acceptance" recommendation.
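The reviews in this row repeatedly describe the same core mechanism: an auxiliary gradient is split into a part inside the span of the primary-task gradients (further separated into helpful and conflicting directions) and an orthogonal, neutral part, and each part is reweighted before the update. The sketch below is one plausible NumPy instantiation of that idea for illustration only; the basis construction via a full SVD, the sign test against the mean primary gradient, and the example weights (1, 1, -1) are assumptions rather than the reviewed paper's exact algorithm.

```python
# Illustrative sketch only: decomposing an auxiliary-task gradient relative to the
# subspace spanned by primary-task gradients, then reweighting the three parts.
import numpy as np

rng = np.random.default_rng(0)
d, k = 256, 8                        # parameter dimension, rank kept for the subspace
J = rng.normal(size=(k, d))          # rows: per-example primary-task gradients (Jacobian)
g_aux = rng.normal(size=d)           # one auxiliary-task gradient

# Orthonormal basis of span{primary gradients}; the paper reportedly uses a
# randomized low-rank SVD here, but a full SVD is enough for this toy example.
_, _, Vt = np.linalg.svd(J, full_matrices=False)   # Vt: (k, d)

coeff = Vt @ g_aux                   # coordinates of g_aux inside the primary subspace
ref = Vt @ J.mean(axis=0)            # coordinates of an (assumed) reference primary gradient

helpful = coeff * (coeff * ref > 0)  # directions where g_aux agrees with the primary task
harmful = coeff * (coeff * ref <= 0) # directions where it conflicts
g_helpful = Vt.T @ helpful
g_harmful = Vt.T @ harmful
g_neutral = g_aux - Vt.T @ coeff     # orthogonal part: changes the auxiliary loss only

w_helpful, w_neutral, w_harmful = 1.0, 1.0, -1.0   # example weights, cf. the reviews above
g_update = w_helpful * g_helpful + w_neutral * g_neutral + w_harmful * g_harmful

# The three parts sum back to the original auxiliary gradient (exact decomposition).
print(np.allclose(g_helpful + g_harmful + g_neutral, g_aux))
```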
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 8774, 253, 4477, 1246, 247, 2087, 15895, 273, 1027, 7533, 275, 1554, 262, 1945, 4715, 1690, 3215, 26208, 27005, 275, 247, 4758, 835, 253, 4736, 310, 281, 755, 1682, 3045, 323, 247, 838, 1553, 1245, 3625, 4836, 285, 3081, 24026, 8892, 253, 2022, 2934, 310, 281, 10957, 253, 27935, 327, 253, 24026, 4836, 715, 374, 749, 31748, 247, 24822, 835, 253, 27935, 4833, 3045, 273, 253, 3625, 4836, 285, 247, 24822, 835, 597, 760, 4833, 253, 24026, 4836, 1293, 6890, 253, 2957, 327, 253, 3625, 4836, 1561, 253, 24822, 326, 1057, 452, 4833, 327, 253, 3625, 4836, 352, 310, 3477, 281, 11897, 10746, 326, 452, 247, 2762, 390, 4016, 1055, 327, 253, 3625, 4836, 534, 4483, 281, 2794, 1027, 4715, 15849, 1677, 253, 27935, 326, 1127, 2584, 891, 24026, 4833, 760, 21255, 2762, 4833, 327, 24026, 246, 515, 37685, 4016, 4833, 327, 3625, 4836, 5661, 1543, 921, 11701, 689, 3786, 3636, 11419, 4715, 3082, 327, 374, 3626, 3448, 15302, 285, 495, 2460, 15302, 50276, 296, 3755, 20556, 253, 4477, 1246, 247, 2087, 7792, 323, 271, 1774, 1895, 352, 556, 1142, 4893, 275, 247, 4618, 5235, 273, 4910, 285, 17904, 281, 4680, 670, 11419, 4715, 285, 3215, 26208, 275, 247, 625, 2087, 1039, 22909, 33954, 285, 15965, 3538, 569, 403, 2590, 285, 3477, 281, 2096, 253, 4477, 921, 326, 342, 247, 10182, 4327, 273, 4373, 22041, 616, 2746, 476, 3157, 3045, 3340, 275, 7533, 342, 3710, 941, 327, 253, 3625, 4836, 253, 1543, 327, 3626, 3448, 15302, 403, 671, 4722, 4645, 326, 597, 5115, 247, 2169, 3045, 672, 253, 24026, 4836, 36908, 4555, 3761, 253, 3625, 4836, 941, 50276, 20881, 1255, 265, 690, 273, 253, 22909, 285, 3340, 253, 4737, 2965, 7419, 672, 7296, 253, 465, 45242, 8624, 4972, 273, 480, 1580, 465, 69, 253, 2020, 273, 512, 27935, 275, 305, 10422, 588, 4518, 1335, 452, 247, 1943, 4833, 327, 253, 3045, 273, 253, 3625, 4836, 253, 4081, 5933, 23970, 247, 2257, 273, 3081, 4373, 22041, 285, 417, 512, 273, 1110, 4373, 3602, 403, 6283, 5469, 1162, 66, 10422, 285, 1162, 522, 3428, 403, 6283, 5469, 285, 253, 4477, 2410, 1763, 5356, 921, 326, 841, 3602, 403, 29688, 873, 407, 643, 3082, 347, 973, 4677, 374, 2722, 271, 28913, 1263, 533, 1057, 417, 1361, 24432, 281, 873, 253, 5469, 2193, 1677, 253, 1781, 11041, 689, 608, 6613, 253, 4327, 273, 253, 24822, 323, 305, 10422, 3133, 1077, 4619, 285, 253, 2530, 5661, 1543, 285, 5955, 403, 8489, 14999, 849, 476, 247, 3632, 4327, 273, 24822, 3720, 3157, 253, 1543, 18899, 253, 14871, 676, 14714, 9887, 310, 760, 2218, 1046, 295, 5018, 533, 627, 310, 642, 3748, 273, 295, 1996, 690, 5661, 1543, 943, 320, 2530, 281, 18578, 253, 9414, 326, 253, 3720, 1057, 417, 1818, 1512, 1199, 1677, 374, 1027, 39657, 432, 253, 3625, 4836, 50276, 977, 16157, 253, 906, 275, 2829, 374, 310, 1199, 1805, 685, 253, 1682, 906, 275, 2829, 577, 327, 253, 5798, 87, 8289, 14175, 3368, 752, 310, 253, 3064, 476, 253, 4679, 275, 2829, 577, 320, 6015, 281, 320, 625, 10870, 281, 2829, 374, 21136, 4971, 954, 8244, 29217, 436, 789, 752, 1511, 273, 24822, 3720, 310, 908, 275, 326, 789, 352, 651, 320, 4722, 281, 923, 247, 1480, 5301, 875, 21136, 4971, 285, 253, 4081, 1332, 342, 1162, 66, 10422, 50276, 1637, 10422, 9765, 10422, 9765, 10422, 285, 970, 253, 1072, 3720, 2393, 15910, 846, 884, 44540, 3133, 3240, 2159, 285, 1537, 5513, 690, 273, 253, 1781, 11041, 275, 253, 1543, 50276, 37585, 16157, 465, 310, 5611, 1293, 1199, 8813, 534, 369, 247, 2372, 21643, 327, 806, 4361, 352, 943, 320, 4518, 4767, 326, 352, 310, 247, 
Below is given a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: this work considers the controlled feature selection problem or selecting a small number of important features subject to fdr control the proposed method flowselect builds on the existing literature of conditional randomization tests and holdout randomization tests as well as normalizing flows from the deep generative modeling literature in simple terms flowselect uses normalizing flows to learn a data distribution and mcmc to sample individual heldout features from their conditional distribution it then passes these samples through a single model fit on the original data to generate samples from the null distribution and it then compares these to the observed test statistic calculated using the observed data with the rejection threshold set according to benjaminihochberg the proposed approach is justified using an asymptotic argument normalizing flows are theoretically capable of modeling any distribution given enough transforms and mcmc should be able to sample from the learned distribution given enough samples the empirical results are quite promising flowselect provides better results than several competing knockoff methods across two types of synthetic tasks as well as reasonable results on a real scrnaseq dataset strengths the method combines known methods for feature selection under fdr control with two new techniques at least in this subfield normalizing flows and mcmc to sample from the learned distribution the asymptotic argument makes clear why this approach should work at least in scenarios with a large amount of data a large flow model and sufficient mcmc samples the empirical results are quite promising the ablation study showed that multiple methodological innovations are important for helping flowselect outperform the baselines the normalizing flows and mcmc approach for handling heldout features as well as the hrt weaknesses the theory is somewhat optimistic the pvalues converge only under two asymptotic results its natural to develop asymptotic theory first as this is often easier but i wonder if theres any way the theory could be refined to more closely reflect how flowselect performs in practice or at least help control whether either of the key steps is working well enough more on this in the questions section below nits this may not be a mistake but i wasnt sure of the reason for the 1 in the summation and denominator of eq 5 theres a typo on line 252 to have support in i may have missed this but what exactly was the test statistic used in the experiments the caption of figure 2 says the statistics were calculated using either lasso or random forest was it the models predictive accuracy a couple of the questions above get at limitations that might be helpful to discuss ability to identify important features that are strongly correlated with one another in the worst case identical limited applicability of theory to practical usage unclear scaling with dataset size docsepthis paper proposes a new variable selection method called flowselect to perform controlled feature selection that does not suffer from the problems present in knockoffsbased methods asymptotically the proposed method computes valid pvalues empirically flowselect consistently controls the fdr on both synthetic and semisynthetic benchmarks whereas competing knockoffbased approaches do not strength the paper is wellmotivated and wellwritten the proposed method has nice theoretical guarantees and
sufficient empirical evaluations weakness in fact some work on knockoffs has improved the two problems highlighted by the author that lead to the failure of traditional knockoffsbased methods eg fan y et al rank largescale inference with graphical nonlinear knockoffs however the authors do not discuss this related literature all my concerns are presented in weakness and questions moreover i do not have any concern on negative social impact docsepthe paper considers the problem of multiple hypothesis testing with fdr control the work is situated in the modelx setting and is motivated by the limitations of the existing modelx multiple testing methods with an emphasis on the knockoffbased methods the paper proposes using the normalizing flow technique to estimate the conditional distribution of features ie $x_j \mid x_{-j}$ which combined with crthrt yields approximately correct pvalues for conditional hypothesis testing the pvalues are subsequently fed to the bhq procedure and a selection set is provided the proposed method is evaluated with synthetic and semisynthetic data showing satisfying empirical results when compared to other methods that rely on inexact knockoff constructions strength 1 the paper finds a nice connection between a stateoftheart conditional density estimation method and the modelx feature selection method 2 the proposed method performs well empirically weaknesses 1 i find the motivation and comparison of the proposed method unsuitable the paper tries to present the proposed method as an alternative to the knockoffbased method but at its best ie when the mapping g is exactly known the proposed method does not guarantee fdr control due to the dependence between the pvalues further the target distribution that knockoffs or inexact knockoff machines try to learn is different from what the proposed method tries to learn the former is a knockoff distribution satisfying the swapping invariance property while the latter is simply the conditional distribution i would suggest framing the proposed method as a way of implementing crthrt when the feature distribution is not exactly known 2 following the previous comment i find the comparison in the simulations a bit unfair methods with different guarantees onebit pvalues versus multibit pvalues for example it might be more appropriate to consider using the existing deep method to learn $x_j \mid x_{-j}$ and apply crthrt 3 the presentation of this paper can be improved a the notation is not consistent throughout the paper for example the dimension is represented as d in line 44 then d in line 45 and then p in line 103 b the description of the existing method is a bit hard to follow for example it may help to briefly explain where the test data and training data come from theorem 1 involves the index n and it would help to explain what this n is in the actual flow used is it m my comments on the limitations have been included in the strengths and weaknesses part the authors have adequately addressed the potential negative societal impact of their work docsepthe authors present a method called flowselect based on normalizing flows to select features in a controlled fashion meaning that the false discovery rate fdr is limited they select features by first approximating the data distribution via the normalizing flow and then computing a pvalue for each feature explaining the data with respect to some model eg a lasso regression model or a random forest estimating the pvalues involves sampling from the marginal distribution of the flow which is done
via mcmc to make sure that the fdr can be controlled the authors prove that their estimates of the pvalues are asymptotically correct ie that the estimates converge to the true value almost surely in their experiments they apply their method to a synthetic dataset ie correlated gaussians a semisynthetic dataset based on gene data and a realworld dataset where they try to discover genes relevant for the oil content in soy beans their method outperforms several baseline procedures based on knockoffs etc and they are able to find snps relevant for the oil content which have also been confirmed by other studies strengths in general the paper is wellstructured and easy to read i appreciate the background section especially because it guides the reader through the four lines of research the authors take ideas from and combine them into a new method flowselect combines several known and established methods in a creative way it is well grounded from a theoretical perspective as the authors prove that their estimator for the pvalues converges to the true value almost surely flowselect was put to the test on three different datasets it is shown to have high power on the synthetic and semisynthetic datasets higher or on par with all of the baselines while having an fdr at or below the threshold which is not the case for all the baselines that the authors are also able to identify snps which are relevant for the oil content in soy beans is impressive weaknesses although the authors compare the snps they identify in the soy bean experiment to what has been found in other studies they do not apply the baselines they used in the previous experiments to the same dataset which makes their findings slightly less convincing moreover they give little insight into the computational cost of their method compared to competing procedures computing the marginals from the flow is only possible through mcmc which can be very inefficient if the proposal distribution does not represent the actual distribution well especially when dealing with genomic data with 10k features this could render the method very inefficient unfortunately the authors give very little insight into how they chose these proposal distributions and ensured that they represent the true marginals well conclusion im in favor of accepting the article although i wish that the authors could clarify the concerns i raised above it is not clear how well flowselect can be scaled up to very high dimensional datasets such as genomic data with 10k features ### Summary:
this paper describes how to use normalizing flows for selecting features in a way that controls the type1 error by using a normalizing flow along with mcmc to sample from the null distribution the majority of the reviewers were positive however the most confident reviewer was negative from taking a look at that reviewers concerns i tend to agree with most of them the paper is titled knockofffree which means in the context of this paper that both 1 1bit pvalues are not used and 2 the full knockoff property is not required only sampling from complete conditionals is required most of the experiments compare knockoff methods to the proposed approach so its not clear if 1 1bit pvalues are not great or 2 the modelx complete conditional sampling process is better with normalizing flows the former point is known and the latter point on the best way to sample from the complete conditionals is really where the value lies if we take the paper as 1 complete conditionals are 1d 2 mcmc can be used to sample from a 1d unnormalized density 3 simple mcmc wont be bad because the problem is 1d so any likelihoodbased deep generative model can be used to sample complete conditionals then its a solid paper on the other hand the belief that flows are the correct choice versus other likelihoodbased deep generative models is harder to take as theres only a comparison with a mixture density network used in the original hrt paper also from other uses of these models different models are better in different situations id suggest a heavy discussion in the paper on this point at the minimum maybe even a reframing of the paper is needed finally for the test statistic the hrt may not be the best choice for work like this paper that studies the problems with estimating the x distribution the paper contra contrarian statistics for controlled variable selection at aistats 2021 shows that the hrt test statistic is more sensitive to modelx estimation errors than a simple mixture statistic that doesnt give up much power the choice of test statistic also merits some discussion in step 3
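the recipe discussed in the review and summary above — learn a joint density, resample each held-out feature from its 1d complete conditional with mcmc, turn the resampled copies into an empirical pvalue, and threshold the pvalues with benjaminihochberg — can be sketched compactly. the python sketch below is illustrative only: the joint log density is a stand-in for a fitted normalizing flow, and all function names, arguments, and defaults are assumptions of this sketch rather than the authors implementation.

```python
import numpy as np


def joint_log_density(X):
    # stand-in for a learned joint density, e.g. log p(x) from a fitted
    # normalizing flow; an isotropic gaussian keeps the sketch runnable
    return -0.5 * np.sum(X ** 2, axis=-1)


def resample_feature(X, j, n_steps=200, step=0.5, rng=None):
    # 1d metropolis-hastings on coordinate j with the other coordinates held
    # fixed, targeting p(x_j | x_-j), which is proportional to the joint p(x)
    rng = np.random.default_rng() if rng is None else rng
    Xc = X.copy()
    logp = joint_log_density(Xc)
    for _ in range(n_steps):
        prop = Xc.copy()
        prop[:, j] = Xc[:, j] + step * rng.standard_normal(len(Xc))
        logp_prop = joint_log_density(prop)
        accept = np.log(rng.uniform(size=len(Xc))) < logp_prop - logp
        Xc[accept, j] = prop[accept, j]
        logp = np.where(accept, logp_prop, logp)
    return Xc[:, j]


def crt_pvalue(X, y, j, statistic, n_null=100, rng=None):
    # empirical pvalue: fraction of null statistics (feature j resampled from
    # its complete conditional) at least as large as the observed statistic;
    # the statistic is assumed to be larger when feature j carries signal,
    # and the +1 terms keep the pvalue valid with finitely many null draws
    rng = np.random.default_rng() if rng is None else rng
    t_obs = statistic(X, y)
    t_null = np.empty(n_null)
    for b in range(n_null):
        Xb = X.copy()
        Xb[:, j] = resample_feature(X, j, rng=rng)
        t_null[b] = statistic(Xb, y)
    return (1 + np.sum(t_null >= t_obs)) / (n_null + 1)


def benjamini_hochberg(pvals, fdr=0.1):
    # standard bh step-up rule: reject the k smallest pvalues where k is the
    # largest index with p_(k) <= k * fdr / d
    pvals = np.asarray(pvals)
    d = len(pvals)
    order = np.argsort(pvals)
    passed = pvals[order] <= fdr * np.arange(1, d + 1) / d
    selected = np.zeros(d, dtype=bool)
    if passed.any():
        k = np.nonzero(passed)[0].max()
        selected[order[: k + 1]] = True
    return selected
```

in the method reviewed above the statistic would additionally be computed through a single predictive model fit once on the original data (the hrt idea) and the joint log density would come from the trained flow; because only one coordinate moves at a time, each mcmc chain is a 1d problem regardless of the overall dimension, which is the point the summary makes about using any likelihoodbased generative model here.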
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
Determinantal point processes (DPPs) provide an efficient and elegant way to sample a subset of diverse items from a ground set. This has found applications in summarization, matrix approximation, and minibatch selection. However, the naive sampling algorithm for a DPP takes time O(n^3), where n is the size of the ground set. The authors provide an alternative model, DPPNet, for sampling diverse items that preserves the elegant mathematical properties of DPPs (closure under conditioning, log-submodularity) while having faster sampling algorithms. The authors need to compare the performance of DPPNet against faster alternatives for sampling from DPPs (e.g., https://arxiv.org/pdf/1509.01618.pdf), as well as compare on applications where there is a significant gap between uniform sampling and DPPs, because those are the applications where DPPs are crucial; the examples in Table 2 and Table 3 do not address this.

This paper proposes a scalable algorithm for sampling from DPPNets, a proposed model which approximates the distribution of a DPP. The approach builds upon a proposed inhibitive attention mechanism and transformer networks. The proposed approach and its focus on sampling are original as far as I can tell, and the problem is important to parts of the community, as DPPs or similar distributions are used more and more frequently. However, the applicability of the proposed approach is limited, as it is unclear how to deal with varying ground set sizes. The authors briefly discuss this issue in their conclusion, referring to circumventing the problem by subsampling; this can however be problematic, either requiring samples from a DPP or incurring a high probability of missing important items. Furthermore, the evaluation method used is biased in favor of DPPNets, since the numerical results evaluate the likelihood of samples under the very DPP that the DPPNet is trained to approximate; this makes it difficult to draw conclusions from the presented results. I understand that this evaluation is used because there is no standard way of measuring the diversity of a subset of items, but it is also clear that no baseline can be competitive under it. One possibility to overcome this bias would be to consider a downstream task and evaluate performance on that task. Furthermore, I suggest making certain aspects of the paper more explicit and providing additional details; for instance, spell out a training algorithm and provide equations for the training criterion and the evaluation criterion. Please comment on the cost of training (constantly computing the marginal probabilities for training should be quite expensive) and on the convergence of the training (maybe show a training curve); this would be interesting in light of Theorem 1 and Corollary 1. Certain parts of the paper are unclear or details are missing. Table 3: what is "dpp gao"? How are the results for k-medoids computed, including the standard error; are these results obtained by computing multiple k-medoids solutions with differing initial conditions? In the paper you say that greedily sampling the mode from the DPPNet achieves a better NLL than DPP samples themselves; what are the implications of this? What is the NLL of an approximate mode of the original DPP? Is the statement you want to make that the greedy approximation works well?

Quality: 5/10. This paper proposes DPPNet, which approximates determinantal point processes with deep networks via an inhibitive attention mechanism. The authors provide a theoretical analysis, under some condition, that the DPPNet is log-submodular.
Clarity: 9/10. The paper is well written and provides a clear figure to demonstrate the network architecture.
Originality: 6/10. The paper is mainly based on the work of Vaswani et al., "Attention is all you need" (2017); it computes the dissimilarities by subtracting the attention of the original work from one and then samples a subset with an unrolled recurrent neural network.
Significance: 5/10. The paper uses negative log-likelihood as the measurement to compare DPPNet with other methods; without a further application it is difficult to measure the improvement of this method over the others.
Pros: 1. The paper is well written and provides a figure to clearly demonstrate the network architecture. 2. The paper provides a deep-learning way to sample a subset of data from the whole data set and reduces the computation complexity.
Some comments: 1. Figure 4 shows the sampled digits from the uniform distribution, DPPNet with mode, and the DPP; how about the sampled digits from k-medoids? Providing these would make the experiments more complete. 2. The objective of DPPNet is to minimize the negative log-likelihood; the DPP and k-medoids have other motivations and do not directly optimize the negative log-likelihood. This may be the reason why DPPNet performs better on negative log-likelihood even than the DPP itself. Could the authors provide some other measures, like the visual comparison in Figure 4, to compare these methods? 3. Does "gen-dpp mode" in Table 2 mean the greedy mode in Algorithm 1? A clear notation would make this more clear.
### Summary:
The paper addresses the complexity issue of determinantal point processes via generative deep models. The reviewers and the AC note the critical limitation of the applicability of this paper to variable ground set sizes, for which the authors' rebuttal is not convincing enough. The AC thinks the proposed method has potential and is interesting, but decided that the authors need more work before publication.
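For readers who want a concrete sense of the complexity issue raised above, the following is a minimal NumPy sketch, not taken from the paper under review, that contrasts the standard exact DPP sampler (whose eigendecomposition step is the O(n^3) cost the first reviewer points to) with the greedy log-determinant "mode" heuristic that the reviews use as a point of comparison. The kernel matrix L and the subset size k are placeholder inputs.

```python
# Sketch only: contrasts the O(n^3) exact DPP sampler with a greedy "mode" heuristic.
# Assumes a symmetric PSD kernel matrix L of shape (n, n); not code from the paper.
import numpy as np

def exact_dpp_sample(L, rng=None):
    """Standard exact L-ensemble sampler; the eigendecomposition alone costs O(n^3)."""
    rng = rng or np.random.default_rng()
    vals, vecs = np.linalg.eigh(L)                     # the O(n^3) bottleneck
    keep = rng.random(len(vals)) < vals / (vals + 1.0)
    V = vecs[:, keep]
    items = []
    while V.shape[1] > 0:
        probs = (V ** 2).sum(axis=1) / V.shape[1]      # Pr(pick item i) given V
        probs = probs / probs.sum()
        i = int(rng.choice(len(probs), p=probs))
        items.append(i)
        j = int(np.argmax(np.abs(V[i])))               # column to eliminate
        v = V[:, j].copy()
        V = V - np.outer(v, V[i]) / v[i]               # zero out row i in every column
        V = np.delete(V, j, axis=1)
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)                     # re-orthonormalize
    return sorted(items)

def greedy_dpp_mode(L, k):
    """Greedy log-det maximization: the cheap 'mode' baseline the reviews mention."""
    selected = []
    for _ in range(k):
        scores = {}
        for j in range(L.shape[0]):
            if j not in selected:
                idx = np.ix_(selected + [j], selected + [j])
                scores[j] = np.linalg.slogdet(L[idx])[1]
        selected.append(max(scores, key=scores.get))
    return selected
```

Exact sampling becomes impractical as the ground set grows, which is precisely the gap that amortized samplers such as DPPNet aim to close; the greedy mode is cheaper but only approximates the most likely subset rather than sampling from the distribution.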
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This paper studies the problem of active reward learning in reinforcement learning. The objective is to design an algorithm that allows the user, after limited access to a few reward queries, to not only reconstruct the reward function but also use it to produce an epsilon-optimal policy. The authors assume the rewards are in [0, 1] and that the human feedback is random, such that for any state-action pair and episode step (s, a, h) the observed human feedback y satisfies P(y | s, a, h) = f_h(s, a, h), where f_h lies in some known function class F. It is further assumed that the true rewards are a thresholded version of the human feedback, so that the reward of (s, a, h) equals 1 if f_h(s, a, h) >= 1/2 and 0 otherwise. They make use of a Massart noise assumption to ensure the learnability of this threshold function. It is possible to show that sufficiently many samples from the human labeler are enough to learn the thresholded reward even over all state-actions in a data pool; the number of labels needed depends only on the gap Delta from the Massart noise assumption and on the complexity of the function class F. In short, even if the dataset is large, a dataset-size-independent number of samples suffices to learn the reward function over all of it. The authors then combine this result with a reward-free exploration scheme to first collect sufficient data from either a tabular or a linear MDP and use it for reward labeling. The reward-labeling results described above imply that, in order to learn the reward function over all the datapoints in the data batch generated by free exploration, it is enough to query the rewards over a number of state-action pairs that depends only on the noise condition and the size of the function class. They complement their discussion with some commentary on how to extend their results to the setting of offline RL.
1. Strengths: The setting studied by the paper seems novel, and the discussion these results may bring about would represent a welcome addition to the reinforcement learning literature. Developing theoretical understanding of HIL algorithms, and of how it is that in practice these algorithms do not require massive amounts of data, is very important; thus the assumptions used in this work should not be thought of as limiting, but instead as ways of developing understanding of these issues.
2. Weaknesses: The technical contribution of this work is limited. The algorithms presented and their analysis are, in my understanding, mainly lifted from the existing literature. The active reward learning guarantees are not really novel or surprising. The reward-free section lacks citations, but these results are already present in the literature.
Despite 2, I think the paper has merit because of the introduction of this setup. This work poses no negative societal impact.

Summary: To better understand human-in-the-loop RL, this paper aims to provide a theoretical understanding of the feedback required from a human to learn a reward function. The authors propose a provable active-learning RL algorithm which queries a human for feedback on its actions. The proposed algorithm needs to query the reward function for feedback O(dim_R) times, in contrast to standard RL algorithms, which require at least Omega(poly(d) * 1/epsilon) samples. This is a purely theoretical paper and there are no experiments.
Decision: Overall I think this paper is interesting in the problem that it studies, and the theoretical results seem important. I am not an expert in this area, however, and will tentatively rate the paper at slightly below the acceptance threshold. The submission is held back by some open questions, namely the comparison of the derived bound versus the bound of standard RL algorithms. Experiments in a toy environment with an unknown but programmatic reward function that can be queried would also be nice and would strengthen confidence in the importance of the derived algorithm.
Strengths: The problem studied, that of optimally querying a human for feedback on reward, is an interesting line of study that deserves a theoretical analysis. The bound derived is surprisingly elegant, and the presentation of the theory is mostly readable.
Weaknesses: Experiments in a toy environment with some programmatic reward function that can be queried would help contextualize the proposed algorithm and the tightness of the bound. The biggest open question is how to contextualize the derived bound, which depends on a measure of complexity of the reward function, against the standard bound that depends on the complexity of the transition. The conclusion section, which the authors point to, only discusses future work and not the limitations of the algorithm nor negative societal impact; as a theory paper, societal impact is difficult to ascertain. Limitations of the theoretical analysis are somewhat discussed throughout the paper, but it would be nice to see a candid discussion in the conclusion of the overall limitations of this approach.

This paper aims to solve the RL problem in a setting where the rewards are initially unknown. The algorithm proceeds in two stages: first, in the exploration phase, it collects samples from the environment to learn about the transitions from state to state; then it queries the user about rewards for specific state-action pairs and comes up with an approximation to the optimal policy. The paper shows a bound on the suboptimality gap of the resulting policy. The analysis is restricted to the case where the true reward is binary.
Minor points: In line 119, the definition of the norm of a function with respect to the set does not use the set Z on the right-hand side; the definition should be changed so that the summation is over the elements of Z. It may make sense to briefly contrast your approach to [1] in the active learning paragraph of the related work section; they solve a related reward active learning problem, although in a slightly different setting (known transition model, unknown reward function).
[1] Lindner, D., Turchetta, M., Tschiatschek, S., Ciosek, K., and Krause, A. (2021). Information Directed Reward Learning for Reinforcement Learning. NeurIPS 2021.
Strengths: The problem of finding out enough information about the reward function of an MDP to come up with a close-enough idea of the optimal policy is a major challenge and crucial to unlocking many applications of RL; this paper makes a good attempt directed towards solving this problem. The theoretical results paint a complete picture of the performance of the agent trained using this method.
Weaknesses: There is no empirical evaluation in the main paper, and the evaluation in the supplementary material is very limited. This is not a dealbreaker, since good theoretical papers can be viable contributions to NeurIPS even without experiments, but including them would make the submission stronger. Presentation of the material is dry and dense at times; it could be improved by providing more intuition about the practical significance of each result. The authors briefly mention limitations of the approach in the conclusion but do not discuss them at length. In my view the biggest limitations are (1) it is unclear how the method scales, and (2) it is unclear what the noise margin will be for realistic problems; a small margin makes the required number of queries in Theorem 3 very large. I do not see any ethical problems with this paper.

The authors propose a theoretical framework for studying human-in-the-loop RL which splits the problem into a reward-free exploration phase and an active reward learning phase. The agent has no knowledge of the reward function and has to learn about the reward from a human expert. In the first phase the agent explores the environment using standard RFE methods; in the second phase it queries the human about states it has encountered during exploration in order to learn about the reward. The authors study the sample complexity of this problem, comparing the sample complexity of exploration with the sample complexity of learning the reward. They propose an active reward learning algorithm and bound its sample complexity as a function of the complexity of the reward function class. They then apply this algorithm to linear MDPs and offline RL, deriving tighter sample complexity results for these special cases.
Strengths: The paper provides a nice theoretical framework to think about human-in-the-loop RL. The theoretical results support the empirical observation that we need much less information about the reward than interactions with the environment, which is encouraging for work on RL that relies on human feedback. The theoretical analysis seems strong and the results all seem sound to me (I did not check the proofs). The application of the theoretical analysis to offline RL is particularly nice and supports the generality of the proposed framework. The paper is well written and well structured overall.
Weaknesses: The setup makes some strong and impractical assumptions, such as binary rewards and bounded noise. The proposed framework of separating an exploration phase and an active learning phase is not novel and is implicitly assumed in a lot of prior work on active reward learning; practical work often finds that iterating between exploration and active learning is beneficial, which the theoretical framework does not capture. The paper could be clearer about how the theoretical framework connects to practical applications. The paper lacks any empirical evaluation; having at least a small empirical evaluation of the proposed method, to see whether the sample complexity results match empirical performance in simple experiments, would make the paper much stronger in my opinion. The paper would benefit from a discussion of the assumptions made for the theoretical results, in particular binary rewards and bounded noise, and should also discuss whether the approach could be extended to nonlinear rewards. I did not find a discussion of the potential broader impact of the paper.
### Summary:
This paper investigates human-in-the-loop RL. The framework and proposed algorithm allow an agent to reconstruct the reward function and produce a near-optimal policy after limited access to a few reward queries. The primary contributions of the work are the problem formulation, the algorithm, and the formal results. All reviewers agreed on acceptance; most importantly, there was consensus that the problem setting is relevant and interesting and that the math is correct. The reviewers noted that the techniques used are not new and the results not surprising, but the formulation is novel; this is not necessarily a bad thing. There was some discussion of some of the assumptions required (binary feedback and bounded noise). In the end there was clear consensus that the paper adds a much needed theoretical framework to HIL-RL and should inspire further algorithmic work. Things to address for the camera-ready: all the reviewers thought it was a bad idea to have related work in the appendix, and the AC agrees; the experiments in the appendix are easy to miss, so reference them more clearly in the main text; and the text is not great in places, especially in the additions made to the paper in response to the reviewers.
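To make the query-complexity point in these reviews concrete, the following is a schematic Python sketch of the second phase they describe: after reward-free exploration has produced a pool of state-action features, a small human-label budget is spent to fit an estimate of the feedback function, and the whole pool is then labeled by thresholding at 1/2. This is not the paper's algorithm; the logistic model for f, the uncertainty-based query rule, and the `human_oracle` callback are illustrative assumptions.

```python
# Schematic two-phase sketch (not the paper's algorithm): phase 1 is assumed to have
# produced `pool_features` (one row per explored (s, a, h)); phase 2 spends a small
# human-label budget and then thresholds the fitted estimate at 1/2 over the whole pool.
import numpy as np
from sklearn.linear_model import LogisticRegression

def label_pool_with_budget(pool_features, human_oracle, budget, seed=0):
    """`human_oracle(x)` returns a noisy 0/1 label; assumes budget <= len(pool)."""
    rng = np.random.default_rng(seed)
    n = len(pool_features)
    labels = {}                                   # pool index -> observed 0/1 feedback
    clf = LogisticRegression()

    while len(labels) < min(budget, n):
        unlabeled = [j for j in range(n) if j not in labels]
        if len(set(labels.values())) < 2:
            i = int(rng.choice(unlabeled))        # need both classes before fitting
        else:
            clf.fit(pool_features[list(labels)], list(labels.values()))
            margin = np.abs(clf.predict_proba(pool_features)[:, 1] - 0.5)
            margin[list(labels)] = np.inf
            i = int(np.argmin(margin))            # query the least-confident point
        labels[i] = human_oracle(pool_features[i])

    idx, ys = np.array(list(labels)), np.array(list(labels.values()))
    if len(set(ys.tolist())) < 2:                 # degenerate: only one class observed
        return np.full(n, ys[0])
    clf.fit(pool_features[idx], ys)
    # binary reward for every point in the pool: 1 iff f_hat(s, a, h) >= 1/2
    return (clf.predict_proba(pool_features)[:, 1] >= 0.5).astype(int)
```

The point the reviews emphasize is that the label budget required depends on the Massart noise margin and the complexity of the function class rather than on the size of the pool, so a large exploration dataset does not translate into a large number of human queries.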
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes a gradual neural network pruning algorithm which utilizes a soft topk mechanism for the weight update of masked/unmasked parameters in particular authors adopt the differentiable topk mechanism of xie et al 2020 which smooths the topk problem in a principled manner via viewing topk as an optimal transport problem and adds the entropic regularizer to make the allocation smooth the model applies both soft and hard topk for the evaluation of the model and only soft for the gradient computation when combined with an appropriate scheduling of sparsity and entropic regularization intensity the method achieves sotalike performance on resnet50 and vitb trained on imagenet1k strength first of all i must say that the paper is very clearly written i like how algorithms 1-2 are placed which makes algorithm 3 a very sensible and straightforward thing to do also the dual averaging-based perspective toward the topkast update was quite new and fresh to me although i would have appreciated slightly more detailed discussion on what dual averaging is and i think this could be inspirational for many future works another big strength of the proposed method is its inference efficiency having much lower theoretical inference flops than topkast weakness my main concern is on the practicality of the proposed method although the central idea of the algorithm is interesting and it seems like it gives a slight boost in terms of the final sparsity-to-performance tradeoff i am slightly worried about the practicality of the method spartan has an additional hyperparameter beta which needs to be carefully selected and scheduled for the performance gain over the existing methods which may require many training runs to be tuned also unlike magnitude pruning whose active parameters decrease as the training proceeds and topkast which has a decreased number of peak parameters via backward sparsity spartan has all parameters active throughout the training having very little benefit in terms of resource used for training training flops memory here are some nitpicks or more minor concerns in line 60 the paper mentions that we show that spartan interpolates between but i do not think this point has been rigorously shown in fact a key idea of topkast is using backward sparsity which is slightly less than forward sparsity but not dense and spartan uses the fully dense-but-rescaled backward in line 137 authors seem to argue that the entropic regularization is necessary in order to efficiently solve eq 2 or equivalently eq 1 i do not think this is necessarily true in fact the solution of eq 1 is actually quite easy to get for uniform c one would simply need to perform the magnitude-based pruning or cost-rescaled magnitude whenever c is nonuniform thus i view the entropic regularization as an artificially introduced but nevertheless neat artifact to make the allocation softer instead of a requirement authors adequately addressed these in my humble opinion docsepthis article introduces a new method spartan which allows training neural networks with sparsity constraints spartan belongs to the dense-to-sparse family of methods it maintains a dense parameter during training and makes it sparse little by little spartan controls the level of sparsity thanks to a parameter beta which performs an interpolation between two popular methods to train sparse neural networks topkast ref 23 of the paper at beta = 0 and imp 47 at beta = infty the
central idea is to use during training a mask on the network parameters based on a softtopk operator implemented via regularized optimal transport overall i find this article very well written especially the introduction the background and the presentation of the contributions the authors have made a notable effort of pedagogy which greatly facilitates the reading and the understanding of the article in particular figures 1 and 2 are very clear didactic and i thank the authors for this effort on the other hand the proposed method seems to me quite relevant and well supported by many experiments the fact that spartan is linked to and generalizes imp and topkast strengthens the contribution the method seems to me quite new it skillfully combines the softtopk operator of xie et al 41 with dual averaging ideas however there are a few points that are in my opinion unclear and would need to be clarified these prevent me from putting a very clear accept on this article but i would gladly change my mind depending on the authors answers na docsepthis paper proposes a new method for pruning dnn parameters the proposed method uses a differential topk operator so that the proposed method interpolates between imp and topkast and balances exploitation and exploration well strengths the proposed method is simple and easy to use the proposed method outperforms existing methods the experiments include the modern vit architecture weaknesses while i grasp the intuitive idea of the proposed method eg the proposed method interpolates imp and topkast i dont find the rationale behind the specific design of the proposed update rule more detailed discussions and hopefully principled interpretation eg the update of the proposed method can be interpreted as a descent step of some objective function should be given the authors mentioned the limitations and social impacts well in section 5 docsepthe paper presents spartan a densetosparse algorithm to train sparse neural networks in which the sparsity of parameters can be enforced directly by a predetermined budget the paper has 3 main contributions by incorporating the soft topk operator in the parameter update and by introducing the sharpness parameter spartan allows to interpolate between iterative magnitude pruning and topk always sparse training spartan shows very competitive and consistent performance over the existing methods and also the fully dense training the authors also study the effect of the sharpness parameter on the accuracy as a tradeoff between exploration and exploitation amongst core components of spartan the algorithm 4 leaves me much doubt it appears to me that the algorithm 4 is nothing but the sinkhorn algorithm in that case it is not clear to me the motivation of the stopping criteria in the line 7 in practice and also in the optimal transport community it is enough to control the marginal violation see remark 46 in compot the calculation of the sinkhorn iterations does not look right to me here i present my own derivation of the sinkhorn update first let us define y y y in mathbb rd times 2 by stacking two vectors in mathbb rd and c fracvc 0d in mathbb rd times 2 where 0d is the zero vector in mathbb rd then clearly langle c y rangle fracvct y where langle c y rangle sumij cij yij denotes the frobenius product we can rewrite the problem 2 as miny in mathbb rd times 2 langle c y rangle text subject to y 12 c text and yt 1d k 1dt c kt the two marginals are not yet normalized so denote s 1dt c then define c fraccs c sc fracvc 0d and k frac1s k 
two vectors in $\mathbb{R}^d$, and $C = [\tfrac{v}{c} \;\; 0_d] \in \mathbb{R}^{d \times 2}$ where $0_d$ is the zero vector in $\mathbb{R}^d$ then clearly $\langle C, Y \rangle = (\tfrac{v}{c})^\top y$ where $\langle C, Y \rangle = \sum_{ij} C_{ij} Y_{ij}$ denotes the frobenius product we can rewrite the problem 2 as $\min_{Y \in \mathbb{R}^{d \times 2}} \langle C, Y \rangle$ subject to $Y 1_2 = c$ and $Y^\top 1_d = [k,\, 1_d^\top c - k]^\top$ the two marginals are not yet normalized so denote $s = 1_d^\top c$ then define $\bar{c} = \tfrac{c}{s}$, $\bar{C} = s C = [\tfrac{v}{\bar{c}} \;\; 0_d]$ and $\bar{k} = \tfrac{1}{s}[k,\, 1_d^\top c - k]^\top = [\tfrac{k}{s},\, 1 - \tfrac{k}{s}]^\top$ the above problem is equivalent to $\min_{Y \in \mathbb{R}^{d \times 2}} \langle \bar{C}, Y \rangle$ subject to $Y 1_2 = \bar{c}$ and $Y^\top 1_d = \bar{k}$ now this is a properly defined optimal transport problem if we define the entropic regularization problem as for $\beta > 0$ $\min_{Y \in \mathbb{R}^{d \times 2}} \langle \bar{C}, Y \rangle + \tfrac{1}{\beta} H(Y)$ subject to $Y 1_2 = \bar{c}$ and $Y^\top 1_d = \bar{k}$ where $H(Y) = \sum_{ij} Y_{ij} (\log Y_{ij} - 1)$ the sinkhorn update now reads for dual vectors $u \in \mathbb{R}^2$ and $v \in \mathbb{R}^d$ we have $v = \log \bar{c} - \log \sum_j \exp(-\beta \bar{C}_{\cdot j} + u_j)$ and $u = \log \bar{k} - \log \sum_i \exp(-\beta \bar{C}_{i \cdot} + v_i)$ and the optimal plan $Y$ is given by $Y_{ij} = \exp(v_i + u_j - \beta \bar{C}_{ij})$ comparing $v, u, Y$ to $\nu, \mu, M$ from the lines 4-6 in the algorithm 4 respectively i do not see why they are equivalent reference compot gabriel peyre and marco cuturi 2019 computational optimal transport with applications to data science foundations and trends in machine learning vol 11 no 5-6 pp 355-607 the authors do discuss the limitations and potential negative societal impact of their work ### Summary:
all reviewers agree that the paper is clearly written and proposes an algorithm which is both novel and efficient the rebuttal has clarified a number of points and thereby addressed most of the concerns of the reviewers the authors are thus strongly encouraged to take into account the comments of the reviewers and to add some of the clarifications that they provided in this discussion in the paper and supplementary materials
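The reviews of this row discuss Spartan's soft top-k masking and one reviewer re-derives its Sinkhorn solver. As a point of reference only, here is a minimal NumPy sketch of a generic entropic-OT soft top-k operator of the kind discussed (Xie et al.-style): it assumes uniform unit costs and a plain two-column transport problem, and is not Spartan's Algorithm 4; the function name, defaults, and the log-domain Sinkhorn layout are all illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.special import logsumexp

def soft_topk_mask(scores, k, beta=10.0, iters=50):
    """Generic entropic-OT soft top-k: returns y in (0,1)^d with sum(y) ~= k.

    Transports d unit-mass items onto two bins ("selected", "not selected")
    with masses k and d-k; a high score makes the "selected" bin cheap.
    Illustrative sketch with uniform costs, not Spartan's exact Algorithm 4.
    """
    d = len(scores)
    C = np.stack([-scores, np.zeros(d)], axis=1)       # d x 2 cost matrix
    logK = -beta * C                                    # log kernel
    log_a = np.zeros(d)                                 # row marginals: all ones
    log_b = np.log(np.array([k, d - k], dtype=float))   # column marginals
    f, g = np.zeros(d), np.zeros(2)
    for _ in range(iters):                              # log-domain Sinkhorn
        f = log_a - logsumexp(logK + g[None, :], axis=1)
        g = log_b - logsumexp(logK + f[:, None], axis=0)
    P = np.exp(f[:, None] + logK + g[None, :])          # optimal plan
    return P[:, 0]                                      # soft top-k membership

mask = soft_topk_mask(np.random.randn(10), k=3)
print(mask.round(2), mask.sum())                        # entries in (0,1), sum ~= 3
```

With a large beta the output approaches a hard top-k indicator, while a small beta spreads mass across entries, which is the exploration/exploitation trade-off the reviews refer to.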
[ input_ids: token id list (tokenization of the Input and Output text above) ]
[ attention_mask: list of 1s, one per token (no padding) ]
[ labels: token id list, identical to input_ids for this row ]
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: overview this paper purports to study training deep equilibrium models by studying the optimization dynamics of deep linear equilibrium models the original deep equilibrium model is formalized as follows given a training dataset $(x_i, y_i)$ for $i = 1, \dots, n$ where $x_i \in \mathcal{X} \subseteq \mathbb{R}^{m_x}$ and $y_i \in \mathcal{Y} \subseteq \mathbb{R}^{m_y}$ are the ith input and output respectively the goal is to learn a predictor from a family $\mathcal{H} = \{ f_\theta : \mathbb{R}^{m_x} \rightarrow \mathbb{R}^{m_y} \mid \theta \in \Theta \}$ then instead of trying to map x to y using a finite number of layers deep equilibrium models assume an infinite number of layers and the output z of the last hidden layer is defined by \begin{equation} z = \lim_{l \rightarrow \infty} z^l = \lim_{l \rightarrow \infty} h(z^{l-1}, x; \theta) = h(z, x; \theta) \end{equation} where h is some continuous function of choice in particular with deep equilibrium linear models h is constrained as follows \begin{equation} h(z^{l-1}, x; \theta) = \gamma \sigma(A) z^{l-1} + \phi(x) \end{equation} where $\phi(x)$ is a feature map of x that transforms $x \in \mathbb{R}^{m_x}$ into $\phi(x) \in \mathbb{R}^{m}$, $\theta = (A, B)$ are two trainable matrices where $A \in \mathbb{R}^{m \times m}$ is for computing each hidden output and $B \in \mathbb{R}^{m_y \times m}$ is for computing the final output of the network $\gamma \in (0, 1)$ is some positive real number and $\sigma$ is a nonlinear function to ensure the existence of the fixed point z this model is linear in z the objective function of this deep equilibrium linear model can be written as follows \begin{equation} L(A, B) = \sum_{i=1}^{n} \ell\left( B \left( \lim_{l \rightarrow \infty} z^l(x_i; A) \right), y_i \right) \end{equation} where $\ell$ is some choice of loss function then this paper provides some motivations behind studying the dynamics of these deep equilibrium linear models by presenting some interesting comparisons between deep equilibrium linear models and normal linear models and additional less interesting comparisons with fully connected feedforward deep neural networks fnn using standard image datasets cifar10 cifar100 and kuzushiji-mnist in their tests the deep equilibrium linear models outperformed both linear models and fnns the main results from this paper are uncovering the dynamics behind these deep equilibrium linear models this paper provided a sequence of proofs that shows linear convergence of these models step by step under the assumption that the loss functions $\ell$ are differentiable which is satisfied by some standard loss functions such as square loss logistic loss and smoothed hinge loss $\ell(f_\theta(x), y) = \max(0, 1 - f_\theta(x) y)^k$ with $k \geq 2$ also despite the nonconvexity of the loss functions it is shown in this paper that a global minimum $L^*$ always exists and under their assumptions the deep equilibrium linear models will always converge to the global minimum linearly strengths the main strength of this paper is that it brought some interesting insights into deep equilibrium models and they showed their results rigorously the definitions and propositions are clear enough for readers with some analysis and machine learning background to fully understand since the dynamics of deep learning models is an open field of research and isnt fully explored this paper will definitely contribute to the deep learning field in understanding the dynamics and convergence theory of these deep equilibrium linear models and potentially benefiting research on understanding more general deep learning models weakness although this paper brought some nice ideas some of the methodologies are not quite convincing for example in the experiments shown in
this paper they compare performance of deep equilibrium linear models with linear models and deep neural networks especially in the comparison with the deep networks it doesnt seem that the networks are deep enough for the comparison to be compelling also in the same experiments they assumed the true data distribution is approximately given by a deep equilibrium linear model and generated data according to this model ie the data is only semireal it would be better to show that the deep equilibrium linear model outperforms other models under a more general setting throughout the entire paper its unclear what are the contributions to be specific in the first two sections its unclear whether they want to show the trainability of the deep equilibrium linear models or learn the dynamics of these models also some of the interesting aspects of this paper were missing some details for example one would be interested in seeing why exactly can the deep equilibrium linear models outperform linear models since they are both linear and the only difference is how they are trained in the last two sections of this paper they also brought up implicit bias which would be another interesting topic to dive into it would be great seeing more comparisons on those aspectsdocsepthe paper discusses the theory of deep equilibrium models with linear activations the model weights are softmaxed to ensure that inference converges to a fixed point a necessary condition for training deep equilibrium models the paper then analyzes the gradient flow dynamics of such models the main result is that linearrate convergence is guaranteed for a class of loss functions including quadratic and logistic losses when training with gradient flow this conclusion is supported by experiments conducted in a teacherstudentlike setup where the labels are generated by a teacher deep equilibrium model showing that training does converge in practice deep equilibrium models represent a novel way to train neural networks and not much is known about them yet theoretically it is important that we understand the dynamics of such models better and this paper is a good step in that direction suggested improvements 1 definition 1 has a typo on the righthand side of the inequality there shouldnt be a gradient 2 the main results are presented clearly and the paper is generally easy to read the only exception is section 32 on the connection with trust region newton methods which i did not understand i recommend clarifying the intuition behind theorem 2 as well as the main message of this section docsepthis submission studies the dynamics and convergence properties of deep equilibrium models which are parametric fixedpoint iterations corresponding to the infinite depth limit of weighttied neural networks as the authors point out these networks differ from deep linear networks and networks in the ntk scaling in that the optimization remains nonlinear wrt the parameters the authors prove two results first they establish linear convergence to the global minimum under the relatively strict assumption of a local plinequality secondly they show that the dynamics of the deep equilibrium models differs from gradient descent dynamics and in fact is related to a trust region newton method the first theorem is nice and wellpresented but i think the issue of the radius over which the plinequality holds could be better discussed i could not tell whether or not the convergence depends on starting within the locally smooth and quadratically bounded region of the loss 
the second theorem regarding the nature of the dynamics lacks a clear interpretation basically all that is said is that the dynamics is distinct from what would be seen in a linear model and its not clear that this dynamics has anything to do with implicit regularization as the authors suggest i would recommend the authors clarify the discussion of this result the experiments are somewhat bizarre and i felt that they were a little misleading about the representative power potential of these models but perhaps i did not fully understand the setup and intent the authors randomly sample a deep equilibrium model and then use it to represent a conditional probability distribution where the conditioning is with image data then the authors fit various models to this distribution and show that the deep equilibrium models which is precisely the underlying function representing the distribution has better performance than other classes of functions i think the description in this section could be vastly improved perhaps presenting this more akin to a studentteacher problem docsep summary this work focuses on the study of global convergence and gradient dynamics of a recently proposed family of models the deep equilibrium linear models delm under common classes of loss functions exploiting the neumann series convergence and the pl inequality analysis the authors proved convergence to global optima of delm without prior assumption on the width m of the model relative to the number of data n here is my general opinion while deep linear models in general as the authors acknowledged has been widely studied and despite the extreme simplicity of the delm when compared to the original deq model that bai et al 1 studied i found this work interesting and important in establishing a solid foundation for the theoretical study of this class of implicitdepth models the authors managed to demonstrate that the gradient dynamics and convergence assumptions of delm is indeed different from typical stacked deep linear models that prior approaches study such as deep linear resnet and throughout the arguments of the paper and the proof in the appendix i can tell how the equilibrium property of delm is making the story different and think this paper sets a good starting point for future similar in this direction for general implicitdepth models but still the paper has a limited scope in terms of the structure it studies pros 1 one of the first theoretical works on the gradient dynamics and convergence properties of the deep equilibrium models 1 and implicit models in general 23 which are quite different from conventional deep networks 2 clear notation and theoretical insights with proof relatively easy to follow the proof seems overall correct there are some that i didnt check closely though 3 clear discussion of the relation including and especially the differences of the prior analysis on deep linear neural networks cons 1 the very definition of delm which the author provided a particular formulation of is of limited scope see my comment below that expands on this point 2 the empirical studies to validate the conclusion of the theoretical results could be strengthened i have some commentsquestions for the authors detailed below 1 the major limitation that i found while reading the discussion and the proof of this paper is that while the authors claim to study deep equilibrium linear models the insights mostly only apply to the models converging with neumann series guarantee and can be written in the form bu1 phix i 
understand the motivation for fixating a provably convergent equilibrium model formulation but a a provably convergent deep equilibrium linear model doesnt need to be neumann for its jacobian for the implicit function theorem to work in the simplest case the fixed point of a function hx on 2d can have a local derivative with absolute value 1 this certainly implies that repeatedly unrolling the function hx may not converge and yet there still is a unique fixed point and one can reliably solve for it see 4 however without the nice neumann series form which allows one to write igamma sigmaa1 as a closedform representation for the infinitedepth network forward pass i dont think the theorems will hold directly typically the ij1 term should only appear in the implicit function theorem which is used for the backward pass i expect the authors to clarify this further b the very design of sigmaa in the model the authors study is a bit bizarre to me why applying a softmax on the weight is it just to ensure that proposition 1 holds ie that you have a handy provablyconvergent linear model the authors stressed a few times that the hcdot function is thus nonlinear wrt sigmaa but i fail to directly see why it matters so much as the model is still linear wrt the input its really a onelinear layer though the inverse from neumann does make a difference on a and in terms of the gradient dynamics the major difference this makes will merely be fracpartial sigmaapartial a i might have missed something here and would appreciate if the authors can clarify 2 i didnt quite get the specific point the authors were trying to make in section 32 in terms of the implicit bias could you expand on that 3 overall i feel that the empirical support of the theoretical findings can be stronger for instance by inspecting different initialization a0 b0 or validating the radius discussion at the end of section 31 for the logistic loss like in zou et al 5 some synthetic data could probably work just fine what is the reason for only using 200 images from the mnistcifar datasets is it to keep the size of phi small but i didnt see the authors report anything about it in section 22 and since the primary purpose of sec 22 is to discuss whether the model would also make sense in practice which i take to mean that you only want to compare the test accuracies coming out of these models wouldnt a 200sample version of mnistcifar too small to draw a robust conclusion on this 4 one that i think could be useful for further thought is the convergence property not just for gd but also sgd like zou et al provided in 5 up to a probability minor things i page 3 the outputs of the deep equilibrium linear models fthetax cdots are nonlinear and nonmultilinear in the optimization variables ab nonlinear in even b ii page 14 vq bq iii page 16 nablaf lab nablaa lab 1 httpsarxivorgabs190901377 2 httpsarxivorgabs190806315 3 httpsarxivorgabs200906211 4 httpsarxivorgabs200608591 5 httpsarxivorgabs200301094 ### Summary:
the paper analyzes the gradient flow dynamics of deep equilibrium models with linear activations and establishes linear convergence for quadratic loss and logistic loss several exciting results and connections solid contribution accept
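The first review of this row spells out the deep equilibrium linear model $h(z, x; \theta) = \gamma \sigma(A) z + \phi(x)$ and its Neumann-series fixed point. A small NumPy sketch of that forward pass follows, assuming $\sigma$ is a row-wise softmax and $\gamma < 1$ as the review describes (the paper's exact parameterization may differ); all names are illustrative.

```python
import numpy as np

def _softmax_rows(A):
    E = np.exp(A - A.max(axis=1, keepdims=True))
    return E / E.sum(axis=1, keepdims=True)

def deq_linear_output(A, B, phi_x, gamma=0.9, tol=1e-10, max_iter=10_000):
    """Forward pass of a deep equilibrium *linear* model as in the review.

    z solves z = gamma * softmax(A) z + phi(x); row-wise softmax gives row
    sums of 1, so gamma < 1 makes the map a contraction and the closed form
    (I - gamma*softmax(A))^{-1} phi(x) (Neumann series) applies.
    """
    S = gamma * _softmax_rows(A)
    m = phi_x.shape[0]
    z_closed = np.linalg.solve(np.eye(m) - S, phi_x)   # closed-form fixed point
    z = np.zeros(m)                                    # "infinite depth" unrolling
    for _ in range(max_iter):
        z_next = S @ z + phi_x
        if np.linalg.norm(z_next - z) < tol:
            z = z_next
            break
        z = z_next
    return B @ z_closed, B @ z

rng = np.random.default_rng(0)
m, m_y = 8, 3
A, B, phi_x = rng.normal(size=(m, m)), rng.normal(size=(m_y, m)), rng.normal(size=m)
y_closed, y_iter = deq_linear_output(A, B, phi_x)
print(np.allclose(y_closed, y_iter, atol=1e-6))        # True: same fixed point
```

The closed form and the unrolled iteration agree, which is the sense in which the model is an "infinite depth" network with a finite-cost forward pass.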
[ input_ids: token id list for the row above ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

This work aims at task-oriented fine-tuning from pretrained ImageNet models. It proposes a neural architecture search and online adaptation framework, NASOA, to perform fast task-oriented model fine-tuning. NASOA first employs an offline NAS to select a group of models, and then picks the most suitable model from this group via an online schedule generator.

Weaknesses: the proposed method may be useful in some fine-tuning scenarios; however, it will have low overall impact from the reviewer's view. Lack of a novel and interesting part: the proposed approach simply uses and combines existing methods (e.g., NAS) and seems more like an engineering project.

docsep

Summary: the paper presents a joint neural architecture search (NAS) and online adaptation (OA) framework named NASOA, aiming at providing a faster task-oriented fine-tuning system. The first step of the approach is an offline NAS to form a pretrained, training-efficient model zoo, which is followed by an online scheduler that selects the most suitable model and generates a personalized training regime with respect to the target task.

Strengths:
- The paper is fairly well written and easy to follow.
- The proposed approach is fairly novel: to my knowledge, this is the first effort to combine NAS and online adaptation techniques for a faster fine-tuning task. In addition, the proposed joint block- and macro-level search space is novel too.
- The practical benefits of the proposed approach are obvious for AutoML systems, and the shared ET-NAS model zoo will help other researchers working on this topic.
- The paper provides in Appendix B a short but interesting analysis of CO2 consumption; this is a good practice which is rare enough to be worthy of note.

Weaknesses and remarks:
- I have some concerns regarding the fact that the paper does not compare the proposed approach to a baseline offline setting. The idea would be to collect offline training data by fine-tuning on several datasets of different nature, collect the corresponding accuracies, and train a predictor (e.g., an MLP). The paper mentions this possibility but says that it is less realistic than the online setting, which is true, but it would have been interesting to compare the two settings.
- I am not sure I understand the relevance of the preliminary experiments presented in Subsection 4.1. The findings, i.e., (1) fine-tuning performs better than training from scratch and (2) learning rate and frozen stage are crucial for fine-tuning, seem obvious to me and correspond to widely adopted common practices.
- It is not clear to me how the models are organized into groups in Table 2. Are they grouped based on their respective complexity? If so, it should be explicitly mentioned by adding a column with the number of parameters of each model.
- When presenting the ImageNet results in Table 2, the paper does not mention that the current SOTA on ImageNet is around 88% top-1 accuracy, i.e., about 6 points above the best result presented in the table. The paper should add this information to slightly tone down the claim that searched models outperform other SOTA ImageNet models in terms of training accuracy.
- It would have been interesting to give more details about the NSGA-II algorithm in the paper itself rather than only referring to the appendix. I understand that this is due to the lack of space, but it is a little bit frustrating to have no information in the paper about this point, especially about the modifications compared to the original algorithm.
- Figures 3 and 4 are quite useless in their current form, since the text size is too small to be read.
- Table 4 is hardly interpretable without reading its corresponding descriptive paragraph in the paper; the authors should give more information (e.g., in the caption) about each method.
- Page 5, line 5: MLP stands for multi-layer perceptron, not perception.

Justification of rating: despite the concerns mentioned above, I think the proposed approach is an interesting addition to the task-oriented online fine-tuning literature, especially considering its practical benefits for AutoML systems. I also think the paper could be considerably improved by taking into account the comments made above. Overall I am leaning to reject (5: marginally below acceptance threshold), but I would consider increasing my rating if some modifications are made during the rebuttal.

Post-rebuttal update: the authors provided a detailed rebuttal which addressed many of my concerns and clarified some points. I will update my rating from 5 (marginally below acceptance threshold) to 6 (marginally above acceptance threshold).

docsep

This paper is the first to address the problem of searching better backbone architectures for downstream tasks together with online hyperparameter selection. The whole system proposed in this paper offers an end-to-end solution for producing a well-trained architecture for a specific downstream task within a fixed training budget; I believe this system will have a large impact on industry model deployment.

This paper separates the overall searching process into (1) generating an efficient-training model zoo pretrained on a source dataset, and (2) a task-oriented fine-tuning schedule (model selection and hyperparameter selection) for downstream-task fine-tuning. The search space of the efficient model zoo is decoupled into two stages, a block-level search space and a macro-level search space; this design could enrich the model zoo with diverse model architectures. The major part of the task-oriented fine-tuning schedule is a performance predictor which is fed with a target dataset embedding, a model identity, and training hyperparameters, and outputs an assessment of the final performance.

Through extensive evaluation experiments, the effectiveness of this system is demonstrated. Through well-defined ablation studies, the authors offer an insightful conclusion on why their system works: (1) a better and more diverse model zoo, where smaller models have simpler block structures and larger models have more complicated block structures; (2) the well-trained performance predictor in the task-oriented fine-tuning scheduler can successfully capture some sort of correlation between a bunch of hyperparameters and the final performance.

Some of my concerns: (1) the searched architectures are not directly optimized for downstream-task performance; (2) this system cannot address the problem of a consistently changing downstream task, and the cost of training the scheduler for a new downstream task is a little bit high because of the large pretraining model zoo.

Overall, I recommend this paper to be accepted due to its engineering efforts, several interesting conclusions drawn from the empirical study, and its great impact in industry.

docsep

In this paper, a joint neural architecture search and online adaptation (NASOA) framework is proposed to achieve faster task-oriented fine-tuning upon the request of users. In particular, two main contributions are made in this paper: (1) a fine-tuning pipeline that seamlessly combines the training-efficient NAS and the online adaptation algorithm is introduced, which can effectively generate a personalized fine-tuning schedule for each desired task via an adaptive model that accumulates experience from past tasks; (2) a block-level and macro-level search space is introduced in the resulting framework, which enables a simple-to-complex block design and fine adjustment of the computation allocation at each stage.

I have two comments about this paper, which go as follows:
1. In this work, a joint NAS-OA framework is introduced to facilitate fast, continuous cross-task model adaptation. Why not learn an effective fine-tuning regime based on a handcrafted network? From my point of view, a good fine-tuning regime should also be independent of the choice of network.
2. The block-level and macro-level search space can reduce the redundant skip-connections in the resulting structures. To the best of my knowledge, the skip-connection is effective in avoiding the gradient-vanishing problem in a very deep neural network; therefore, the skip-connections at the lower layers are very important in the network design. However, the skip-connections at the lower layers will take more memory cost at both the training and testing stages. From this point of view, the authors should explain in the rebuttal where the redundant skip-connections usually occur.

### Summary:
In this paper, a network architecture search (NAS) problem in a changing environment is studied, and an online adaptation (OA) algorithm for the problem is proposed. Many reviewers found that the OA-NAS problem discussed in this paper is interesting and practically important. However, many reviewers, including those with high review scores, recognize that the weakness of this paper is the lack of sufficient theoretical verification. Furthermore, although extensive experiments are conducted, it is still not clear whether the experimental setups discussed in the paper are generally applicable to other practical problems. Overall, although this is a nice work in that a new practical problem is considered and a workable algorithm for the problem is demonstrated in an extensive simulation study, I could not recommend acceptance in its current form because of the lack of theoretical validity and of evidence of general applicability.
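The online scheduler that several of these reviews describe is, at its core, a learned regressor over (task, model, hyperparameter) triples. The sketch below is a minimal, hypothetical illustration of that idea; it is not the authors' implementation, and every dimension, layer size, and feature choice here is an assumption made purely for the example.

```python
# Minimal sketch (not the reviewed paper's code) of a fine-tuning performance
# predictor: an MLP mapping (dataset embedding, model identity, hyperparameters)
# to a predicted fine-tuning accuracy. In a real system the predictor would first
# be fit on accumulated fine-tuning history from past tasks.
import torch
import torch.nn as nn

class FinetunePerformancePredictor(nn.Module):
    def __init__(self, dataset_embed_dim=64, num_zoo_models=16, hparam_dim=4, hidden=128):
        super().__init__()
        # The model identity is embedded like a categorical token.
        self.model_embed = nn.Embedding(num_zoo_models, 32)
        self.mlp = nn.Sequential(
            nn.Linear(dataset_embed_dim + 32 + hparam_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),  # predicted accuracy under the given schedule
        )

    def forward(self, dataset_embedding, model_id, hparams):
        x = torch.cat([dataset_embedding, self.model_embed(model_id), hparams], dim=-1)
        return self.mlp(x).squeeze(-1)

# Example: score every (model, hyperparameter) candidate for a new target task
# and keep the schedule with the highest predicted accuracy.
predictor = FinetunePerformancePredictor()
dataset_embedding = torch.randn(1, 64)            # e.g. a mean feature of the target dataset
candidates = [(m, torch.rand(1, 4)) for m in range(16)]
scores = [predictor(dataset_embedding, torch.tensor([m]), h) for m, h in candidates]
best_model, best_hparams = candidates[int(torch.stack(scores).argmax())]
```

Ranking candidates by predicted accuracy in this way is what would let a scheduler produce a personalized schedule for a new task without actually fine-tuning every model in the zoo.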
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:

The paper proposes a dataset and a procedure for testing methods that reduce the training sample. The data sources are the well-known CIFAR and TinyImageNet datasets. The authors test how the existing methods behave when using different augmentations, compression ratios, and neural network classifiers, including those found with NAS.

Strengths: (1) the paper attempts to classify the conditions under which a training-sample compression method should work well; (2) the methods tested in the paper are state-of-the-art.

Weaknesses: (1) related work is described in the introduction and in the benchmark description, which makes the paper hard to comprehend; I believe that separating related work into its own section would benefit the paper; (2) the benchmark uses CIFAR-10, CIFAR-100, and Tiny ImageNet as data sources and is limited to image classification; (3) the metrics used for evaluation are described poorly; (4) in general, the work does not look like a complete benchmark with a clear testing procedure; it is rather a set of experiments with different methods.

docsep

The paper presents a benchmark for fairly comparing dataset condensation methods. The benchmark evaluates a condensation method's performance with multiple data augmentation schemes and compression ratios, in addition to its performance in neural architecture search and cross-architecture settings.

Strengths: the benchmark is modular and standardizes the evaluation of dataset condensation methods, and it covers the primary use cases of condensed benchmarks. The experimental section evaluates the state-of-the-art condensation methods and identifies important trends across all methods. The code is open source and seems easy to use, and there is an established scoreboard for the benchmark, which will foster continued improvement.

Weaknesses: the benchmark only focuses on three image classification datasets; therefore, it will not indicate the usefulness of these techniques on other tasks or domains. The benchmark should also include some indication/evaluation of any bias introduced by the dataset condensation method.

docsep

This work aims to provide a comprehensive benchmark and library to study the true effectiveness of dataset condensation methods. In particular, it motivates and provides a framework for carefully studying: model performance under various data augmentations on the condensed dataset; model performance under a wide range of dataset compression ratios; dataset condensation effects across model architectures (MLPs, ConvNets, and ResNets); and speedup effects on NAS. A thorough evaluation on the benchmark using common architectures and dataset condensation methods is provided.

Strengths: a thoughtful and comprehensive benchmark for evaluating dataset condensation methods; strong analysis and experimental studies of existing condensation methods; a convenient and well-documented API to easily access the benchmark.

Weaknesses: the datasets (CIFAR-10/100 and TinyImageNet) and architectures (MLPs, ConvNets, ResNets) seem a little bit limited in scope for understanding transferability across architectures; if the goal is to test condensed datasets in different settings, it makes more sense to consider architectures far beyond CNNs.
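As a concrete, deliberately toy illustration of the evaluation dimensions these reviews describe — training on a condensed set at several images-per-class (IPC) budgets and scoring the result across architectures — the sketch below uses synthetic tensors in place of real condensed and test data. It is an assumption-laden stand-in written for this summary, not the benchmark's actual code.

```python
# Illustrative evaluation grid: train several architectures on a "condensed" set at
# several IPC budgets and score each on a held-out test set. Synthetic tensors are
# used here; a real run would plug in a condensation method's output and a real
# CIFAR/TinyImageNet test split.
import torch
import torch.nn as nn

def make_model(name, num_classes=10):
    if name == "mlp":
        return nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU(),
                             nn.Linear(128, num_classes))
    if name == "convnet":
        return nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                             nn.AdaptiveAvgPool2d(4), nn.Flatten(),
                             nn.Linear(32 * 16, num_classes))
    raise ValueError(name)

def train_and_eval(model, x_syn, y_syn, x_test, y_test, epochs=20):
    opt = torch.optim.SGD(model.parameters(), lr=0.05)
    for _ in range(epochs):
        opt.zero_grad()
        nn.functional.cross_entropy(model(x_syn), y_syn).backward()
        opt.step()
    with torch.no_grad():
        return (model(x_test).argmax(1) == y_test).float().mean().item()

x_test, y_test = torch.randn(500, 3, 32, 32), torch.randint(0, 10, (500,))
for ipc in (1, 10, 50):                        # compression budgets (images per class)
    x_syn = torch.randn(10 * ipc, 3, 32, 32)   # would come from a condensation method
    y_syn = torch.arange(10).repeat_interleave(ipc)
    for arch in ("mlp", "convnet"):            # cross-architecture evaluation
        acc = train_and_eval(make_model(arch), x_syn, y_syn, x_test, y_test)
        print(f"ipc={ipc:>2}  arch={arch:<8}  test_acc={acc:.3f}")
```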
docsep

The paper proposes a benchmark for dataset condensation techniques. More specifically, it aims to identify various factors to be analyzed when investigating dataset condensation, revolving primarily around the number of instances, the choice of architecture, and data augmentation methods, together with the additional aspect of transferability. These factors are then investigated empirically for the CIFAR and TinyImageNet datasets, on the basis of four recently proposed dataset condensation techniques and two straightforward approaches to selecting original data instances to store.

Post-rebuttal update: as indicated in the response below, parts of my concerns have been addressed, whereas the decisions taken not to address others lead me to raise my rating while still leaning towards rejecting the paper in its revised form.

The paper is well structured and easy to read and to follow. The four considered dataset condensation methods are very recent and intuitively promising; in particular, there is an appeal that these may be better than traditional core sets, which may or may not be the case, as some of the empirical investigation has shown here. The idea to establish a leaderboard that is being updated is nice, perhaps not primarily for the reason of being state of the art but, given that there are several dimensions of evaluation, more so for comparison purposes in order to relate methods; maybe this could also be emphasized, as the present narrative seems to favor state of the art. The neural architecture search investigation is interesting because it is an experiment that many may want to conduct, albeit they will find, as shown, that the correlations are negligible; these types of empirical insights may be useful to the community.

The choice of methods and the taxonomy seem overly narrow. Already when reading lines 28-30, it does not become clear why this particular taxonomy is meaningful. Most importantly, it is unclear to me why dataset condensation is framed as a novel and emerging promising direction (line 26). This may indeed be true for the particular idea in the four synthesis methods, but only specifically with respect to combining multiple data points/classes into a single instance. If we think of data subsets, core sets have enjoyed rich theoretical investigation and a long history of empirical methods for their selection; it is somewhat surprising to not even have core sets be mentioned, and to have two oddly specific methods (herding and k-center) be mentioned as some recent advocates of data subset selection.

My biggest concern, stemming directly from the above argument on the lack of mention of various prior works, is that the framing makes it seem like much of the work is particularly novel and insightful, even if it is, at least in parts, well known. More precisely, the notion that k-means-embed is some novel proposed method to select data subsets is not only very stretched but just plain wrong. Core sets have a long history, and the particular suggested approach is far from novel: see, e.g., Tsang et al. (2005) for SVMs, core sets for k-means and k-medians in Har-Peled & Kushal (2005), or, concretely with respect to the novelty of evaluating k-means in the embedding, the robust k-center based on L2 distances in the activations of a deep convolutional layer in, e.g., the core-set approach of Sener & Savarese (ICLR 2018). Overall, the key difference is that these methods not only solve the data subset selection problem, they actually do so with various formal guarantees on the degree to which they work; it is a very common paradigm that finds applications from active learning to continual learning. In my humble opinion, this particular paper under review would actually do itself a favor to remove the k-means-embed part and its advocacy for novelty, and stick to the investigation of the condensation methods rather than core sets. On the flip side, if the paper does in fact want to draw an analogue to core sets, it should do so rigorously and consider more than one method in addition to random sampling, towards a more exhaustive and faithful comparison.
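For readers unfamiliar with the selection strategy this review is debating, the following is a generic, minimal sketch of "select real examples by running k-means in an embedding space" — the classic coreset-style baseline the reviewer has in mind, not the reviewed paper's exact implementation. The random embeddings and all sizes are stand-ins; in practice, the features would typically come from the penultimate layer of a pretrained network.

```python
# Generic k-means-in-embedding selection: cluster each class's embeddings into
# `ipc` groups and keep the real example closest to each centroid.
import numpy as np
from sklearn.cluster import KMeans

def kmeans_embed_select(embeddings, labels, ipc, seed=0):
    """Return indices of roughly `ipc` representative examples per class."""
    selected = []
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        km = KMeans(n_clusters=ipc, n_init=10, random_state=seed).fit(embeddings[idx])
        for center in km.cluster_centers_:
            dists = np.linalg.norm(embeddings[idx] - center, axis=1)
            selected.append(idx[int(dists.argmin())])
    return np.array(sorted(set(selected)))   # de-duplicate, so the count can be < ipc * classes

embeddings = np.random.randn(1000, 64)       # stand-in for learned features
labels = np.random.randint(0, 10, size=1000)
subset = kmeans_embed_select(embeddings, labels, ipc=10)
print(len(subset), "examples selected out of", len(labels))
```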
in addition to random sampling towards a more exhaustive and faithful comparison the data augmentation part is interesting in principle but it is unclear to me what precise insights are supposed to be drawn from the benchmark as a practitioner rather than proposing and conducting a very systematic ablation study on the influence of various types of data augmentation in data condensation creation and later training of a derived model the shown study conflates various augmentation factors it unfortunately becomes completely opaque in terms of which factors contribute when particularly as some augmentation techniques are overlapping and superposed whereas some experiments seem to have a motivation others feel more adhoc for instance what was the hypothesis and expectation behind section 33 it is wellknown and trivial from core sets and intuition that larger amounts of images per class will boost accuracy figure 2 in core sets what is then typically investigated is the quality of the selection mechanism in particular for small sample scenarios here however the dataset condensation methods are all biased by an already picked set of randomly subsampled data instances it would be great to see standard deviations or any measures of statistical deviation across randomly seeded experimental repetitions in any of the experimental result figures given that all of the techniques are extremely stochastic i suspect that the addition of standard deviations would show that there is fairly little difference between the methods in many cases in particular figure 1 2 docsepthis paper introduces a data condensationdistillation benchmark under different model architectures compression ratios and datasets expanding comparisons of prior work to demonstrate how such factors can affect the end performance the evaluation also considers a neural architecture search scenario and two data selection mechanisms both as baselines and as initialization strategies extensive benchmarking with 3 datasets 4 augmentation strategies 4 different model architectures 4 data augmentation and 2 data selection methods several insights on the scalability and generalizability of existing data condensation methods evaluation also compares methods on a wider range of compression ratios with larger ipcs to evaluate in usecases with less stringent compression requirements there are several parts that require further explanation and some of the experimental setup choices are not justified more details below my main concern is about the ease of use and extensibility of the benchmark in other words it is unclear how easy it is to include new methods being developed the github repository does not contain any such documentation and i cannot seem to find such a class interface in general documentation and project structure is not very clear when trying to find the code for each method distributionmatchingpyhttpsgithubcomjustincui03dcbenchmarkblobmainmethodsdistributionmatchingdistributionmatchingpy seems to be an empty python file and mttdistillationhttpsgithubcomjustincui03dcbenchmarktreemainmethodsmttdistillation and others seem to be copies of full repositories from each paper it would be good to consider a common structure across methods and to create a welldocumented api with code releases and github docs example httpsgithubcomopenaigym same comments hold for extending the toolkit to other datasets eg domainspecific ones that a future user would like to work on as well as new data augmentation methods models or downstream tasks are
there instructions on how to do so would be good for the paper andor repository to cover such details does the opensourced code contain unit tests or any other form of code verification to ensure correctness and guard against future bugs docsepthe authors propose a dataset condensation benchmark for images including several architectures the effect of data augmentation and the performance on the nas task they compare four methods dc dsa dm and tm the paper addresses an important problem in ml and thanks to benchmarking is able to underline several shortcomings of the field as well as isolate the effect of various processes there are many errors that make the reading a difficult experience and no discussion of how the benchmark is going to be updated with new models ### Summary:
there are quite a few problems raised by the reviewers worth paying attention to thoroughness of evaluations performance metrics are difficult to understand and insufficiently justified and described theres also mention of nonperformance related metrics like bias not being adequately considered theres some debate over the appropriateness of the included baselines and some skepticism about the justification for some experiments furthermore reviewer sxsg notes it would be great to see standard deviations or any measures of statistical deviation across randomly seeded experimental repetitions in any of the experimental result figures and i agree i also think the authors could do a better job in the main text or supplement justifying the use of these specific baselines however in the revised version of the paper many of these issues are addressed with the rewrite of section 3 restricted scope many mention the limitation of focusing on images and a small handful of datasets reviewer af85 notes the datasets cifar10100 and tinyimagenet and architectures mlpconvnetsresnets seem a little bit limited in scope for understanding transferability across architectures and id agree though i recognize that a goal of the tool is to allow for others to also contribute datasets and send their models to be tested also the authors are correct in noting that many of these issues are a byproduct of the fact that much of the data condensation work so far has been focused on image datasets usability reviewer qbsw reviewer sxsg and especially reviewer 5lwd all mention issues with documentation and note how difficult it is to follow the provided instructions in order to assess a model add a new dataset etc however the authors seem to have addressed many of the concerns improving documentation significantly following this feedback overall it seems the authors paid attention to reviewer critiques and responded respectfully and meaningfully to the provided feedback given the importance of the topic and the current lack of testing infrastructure in this area i recommend we accept this paper as a poster i hope the authors continue to take in feedback at the conference to continue to make further improvements to their benchmarking platform
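For reference, a minimal sketch of the kind of k-center / core-set selection discussed in the review above (greedy farthest-point selection on L2 distances between deep-feature embeddings, in the spirit of Sener & Savarese) is given below. This is an illustrative sketch only, not code from the paper or its benchmark repository; the function name, the NumPy-based implementation, and the random choice of the first centre are assumptions made for illustration.

```python
import numpy as np

def kcenter_greedy(embeddings, budget, seed=0):
    # Greedy k-center (farthest-point) core-set selection on L2 distances
    # in an embedding space, e.g. penultimate-layer activations of a
    # pretrained network.
    rng = np.random.default_rng(seed)
    n = embeddings.shape[0]
    selected = [int(rng.integers(n))]      # arbitrary first centre
    # distance of every point to its nearest selected centre so far
    dist = np.linalg.norm(embeddings - embeddings[selected[0]], axis=1)
    for _ in range(budget - 1):
        nxt = int(np.argmax(dist))         # farthest remaining point
        selected.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(embeddings - embeddings[nxt], axis=1))
    return np.asarray(selected)
```

A subset chosen this way gives the kind of data-selection baseline with known coverage properties that the review argues should be compared against dataset condensation, alongside plain random sampling.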
Below is a given review of a research paper from a conference journal. Please write a summary of the review. ### Review: inspired by the integrated gradient method in attribution methods the authors propose a new importance metric for pruning an integral of the product between the norm of the parameter weight and its attribution along a path between this weight value and a baseline zero value (a small illustrative sketch of such a criterion is given after this review) this metric captures a more global view than previous magnitudebased and gradientbased pruning methods then the authors devise the socalled singe method to intertwine the pruning with the network finetuning empirical studies are conducted to compare singe with both structured and unstructured pruning methods on cifar 10 and imagenet strength 1 motivation is very clear and the idea of using the approximated integration as the criterion of pruning is a good way to measure the influence of forcing a parameter to 0 2 this paper has clear writing and is easy to follow weaknesses 1 figure 1 is not effective and clear enough to facilitate the readers understanding of the arguments made in sec 31 one reason is that the notation mu is not introduced before its appearance also the claims at line 129 133 about the effects of the change in values are not clearly depicted in figure 1 2 the claim that singe significantly outperforms existing stateoftheart dnn pruning methods seems to be a bit inappropriate as its sometimes inconsistent with what is demonstrated in the empirical studies for example in table 5 the accuracy of singe is nearly the same as those of adaptdcp manidpa and mdp however the latter remove significantly more parameters or flops than singe does the authors have included adequate limitations docsepthis paper proposes a new neuron pruning method based on the integrated gradients ig attribution algorithm the paper claims that igbased pruning is more optimal than magnitude and local gradientbased methods since it accounts for the gradient effects globally along the path from 0valued baseline weight to given input weight in each layer l the paper performs an impressive number of experiments for a number of structured and unstructured pruning techniques from the state of the art literature and shows that igbased pruning outperforms those methods based on two evaluation metrics removed parameters and removed floating point operations in addition to that the paper also proposes an entwined pruning and finetuning procedure and shows that it helps to obtain better accuracy while performing the pruning the experiments in the paper are conducted for imagenet with a mobilenet v2 backbone and on the cifar10 dataset strengths overall the idea of the paper is clearly described related work is properly cited experimental setup and results are properly discussed the paper shows strong empirical evidence that igbased pruning outperforms magnitude gradientbased approaches as well as a number of other state of the art approaches from the neuron pruning literature the paper also shows that entwined finetuning helps to improve model accuracy regardless of pruning technique using over 80 pruning target weaknesses i think that in general the readability fluency of the paper can be further improved specifically i felt that in the notation and description of the approaches there is some level of expectation that the readers know the notations of structured pruning techniques and integrated gradients for example neither figure 1 nor the context talking about figure 1 describes what mu and mu2 are figure 1 and its descriptions are a bit
confusing originally i was thinking that a b and c are weight matrices but later the paper refers to a and b as two different weights it would be good to improve the description of figure 1 given that entwined finetuning is one of the important contributions of the paper it would be good to describe it in detail it would be good to make it clear what finetuning over o steps means and what we mean by finetuning the network f over a batch from d it seems that the authors make an assumption that the readers are familiar with the specific finetuning approach that they are describing at this point from the given description it is hard to tell the amount of novelty in the entwined finetuning table 1 it would be great if the authors explain the relatively large std in the integrated magnitude x grad cig2 method minor comments on page 9 a notation nn but i dont see a description of notation n table 6 has 2 singe rows but it is not clear what are the differences between those two singes i do not see any potential negative societal impact in this work docsepthis paper proposes a novel pruning method based on an integrated gradient pruning criterion which combines a magnitude and gradient based criterion and integrates the product over the path of the neuron removal in addition the work proposes a finetuning flowchart to improve the performance of the finetuned network this strategy is an iterative layerwise approach which sets the pruning ratios for each block separately lastly a comparison of singe on resnet56 on cifar10 and resnet50 and mobilenetv2 on imagenet shows stateoftheart performance strengths i like the idea of an integration criterion and according to table 1 it seems that this is the main driver for the superior performance of singe to the best of my knowledge this is the first method to use an integration criterion for structured pruning the integration criterion is well motivated and the ablations support the design choices made by the authors singe achieves stateoftheart performance on cifar10 and imagenet weaknesses the main weakness that i see is that the work does not explain how the flops and parameters of the pruned network are calculated this is a major issue because the work does not directly compare to other stateoftheart methods but merely quotes the numbers since the calculation of the flops and parameters of a pruned network is nontrivial see appendix d of sosp 43 this can have a huge impact on the performance and could reduce the performance significantly the second main weakness is that the work applies an iterative finetuning scheme but compares to singleshot pruning methods such as hap sosp oto and srrgr since the paper shows in table 2 that an iterative pruning scheme seems to improve the overall performance significantly this would give singe an advantage and makes a direct comparison of the pruning criteria harder further the authors do not mention the computational effort to evaluate the criterion due to the riemannian integral this should be significantly higher compared to other gradient magnitude or firstorder based methods lastly a direct comparison to at least one competing method within the same pruning pipeline would provide the possibility to better disentangle the performance gains from the finetuning pipeline and criterion yes ### Summary:
novel pruning method based on integrated gradients reviewers agreed that the method is wellmotivated and that the comparisons showcase the potential of this method there are some concerns regarding fairness of the comparisons in terms of flops and parameter count i believe some of the rebuttal answers from the authors address those concerns i think this work is novel and interesting enough to be accepted at neurips
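The integrated-gradient style pruning criterion described in the review above (integrating the product of the weight magnitude and its attribution along the path from a zero baseline to the current weight value) can be approximated with a simple Riemann sum. The sketch below is a hedged illustration, not the paper's implementation: the function name, the per-weight (rather than per-neuron or per-filter) scoring, the use of absolute gradients, and the number of integration steps are assumptions for illustration only. It also makes the computational cost raised by the reviewer concrete, since each integration step requires one forward and backward pass.

```python
import torch

def ig_style_importance(model, loss_fn, batch, weight, n_steps=16):
    # Riemann-sum approximation of an integrated-gradient style score:
    # accumulate |w| * dL/dw while scaling the weight tensor from a zero
    # baseline up to its current value.
    x, y = batch
    original = weight.detach().clone()
    score = torch.zeros_like(original)
    for step in range(1, n_steps + 1):
        alpha = step / n_steps
        weight.data = alpha * original          # point on the 0 -> w path
        loss = loss_fn(model(x), y)
        grad = torch.autograd.grad(loss, weight)[0]
        score += original.abs() * grad.abs() / n_steps
    weight.data = original                      # restore the original weight
    return score
```

In a structured-pruning setting such a score would be aggregated per neuron or filter and interleaved with finetuning, as the paper proposes; the reviewer's cost concern follows directly from the n_steps forward/backward passes needed per scored tensor.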
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper suggests a variational formulation for discovering ode systems in a closedform the main advantage of such a formulation is that it can find the system based on the observed trajectories and some analytical testing functions rather than estimating possibly noisy derivatives of them the variational objective is equal to the distance between estimated and true velocity vector fields on the manifold when the estimated trajectory converges to the true one in l2 sense and there are infinitely many testing functions while a finite number of functions are okay empirically the proposed method outperforms its counterparts for various ode systems and a tumor volume dataset whose true dynamics is unknown the paper is wellmotivated and easy to follow the proposed variational approach is sound for bypassing time derivatives of noisy observations it is also a modelagnostic framework thus might be widely applicable regardless of ones specific setting empirical validation on the proposed framework dcode shows that dcode is the right way to recover odes under some undesirable but unavoidable measurement errors the tumor growth experiment is very interesting because its closedform ode is indeed unknown dcode can capture an interesting behavior tumor volume increase due to drug resistance while the baseline cannot i have not found any major drawbacks to this paper for these reasons i think this paper is worthy of publication for iclr the following are some relatively minor questions q1 treating secondorder nonautonomous systems as a set of firstorder autonomous odes requires nontrivial effort since the number of variables increases to 2j + 1 is there any suggestion about when one should consider such a situation without prior knowledge of the target system it seems that the authors have found a nonautonomous ode for the tumor growth task thus it will be nice if they can explain their modeling procedure briefly q2 is the search space for mathematical operations used in this paper enough for arbitrary systems if not and at the same time if the variational objective does not converge to 0 well for an unknown target system then what would be the best option for the next behavior eg adding some additional operations or considering higherorder nonautonomous systems such as q1 it may depend on the target problem but it will be nice if the authors can discuss it q3 the authors state that neural odes should estimate the unknown initial condition thus they are not very effective for chaotic systems but it seems that one can rather directly use y0 = x0 + eps0 as an initial condition for neural odes are neural odes still bad for such a case though it might depend on the noise level and lyapunov exponent of the target dynamics the current evaluation of neural odes in figure 4 seems to be extremely poor and i felt the evaluation is slightly unfair thus i wonder if it is truly a limitation of neural odes or due to a bad estimation of the initial state as i mentioned above i think the paper is interesting and thus recommend the acceptance of this paper post rebuttal i appreciate the authors clarifications on my questions id like to keep my score docsepthis paper proposes a new methodology to infer a symbolic ode representation from observed time series in contrast to previous methods they bypass the inference of the time derivative and rather proceed by first estimating a continuous approximation of the trajectory
and then optmizing a novel objective function the first step can be performed with the interpolation method of choice gp or splines the second step can use any optimization scheme that does not require derivative such as genetic algorithm the authors then evaluate their approach on a series of dynamical system and show improved performance compared to the baselines under consideration the paper is well written and addressing an important problem for scientific discovery and interpretability of ode methods my main comment is that i think another baseline to investigate the importance of the proposed objective function would be the following you optimize over an initial value and the function f and you minimize the reconstruction error is this a baseline you could compare against as it does not involve computing the derivative either it could also be competitive and really assess the added value of the proposed objective function i also believe this setup would fit theorem 1 if this is not a good idea can authors explain why as i dont see a clear advantage of the proposed method over the one i just described it currently undermines my evaluation of the significancenovelty of this work however id be very happy to change my mind on this topic in table 2 it seems that the performance differ massively between both dimensions 8 and 9 yet at a glance their functional form look really similar it would be nice to provide some extra information or insights in that regard in the text because all these symbolic regression approaches make use of nondifferentiable optimizers the computational cost can be quite high i would appreciate some discussion of the computational complexity of the different methods and to have an order of magnitude of the computation time its also not exactly clear to me how the interpolation steps work in the multidimensional setup in practice do you use single dimension gps for each dimension or do you use multitask gp to model the correlations in equation 3 the first integral should have fj as f is not a scalar function i think this is a nice paper but to fully motivate the approach it should compare against another obvious baseline i would like the authors to motivate why the setup i described above is not considered in the experiments docsepauthors propose dcode a framework to approximate closedform odes using symbolic regression the method first estimate the trajectory by smoothing the observations using any convenient smoother before inferring a closed form finite sequence of operations transition function this allows for better interpretability of the system authors test dcode on 5 simulated datasets where they show high performance of the method and on one real dataset where they use dcode to model the temporal effect of chemotherapy on the tumor volume and compare it to total variation regularized differentiation srt the closest method in the literature using symbolic regression to approximate closedform odes is novel and very interesting the results presented in the paper are compelling but i would want to see more comparisons to other methods as well as results on more datasets the method section is very well described with useful and precise details found in the appendix the weakness of this paper lies in the experiments and results section authors have selected five dynamical systems governed by a wide range of nine closedform odes however all these system have independent gaussian noise with different noise level i assume that if the noise is properly handled by the 
smoother this will not impact the model but i would like to see experiments on datasets where the smoother does not necessary perform very well low signal to noise ratio in that case will dcode be robust enough to recover correct parameters of the ode if this is not the case i dont really see the advantage of dcode in addition to converting an already inferred trajectory into an ode for the sake of interpretability similarly how will it handle sparse observations one advantage of methods inferring odesde parameters for dynamical systems are the ability to approximate trajectories in regions with few observations here wont the method suffer from the smoother if not able to capture the trajectory correctly to understand better the advantages of dcode over other methods i think that showing results on more than one real dataset i understand the dataset presented here is very challenging with comparisons to more than one method for example neural ode discussed here in the simulated data experiments also is dcode restricted to one dimensional observations would it be possible to perform symbolic regression for higher dimensional latent variables if it is possible then an extension of this framework to higherdimensional observations and latent processes would be very compelling and allow for comparisons with many models from the dynamical systems literature that infer latent odes minor comment the figures especially figure 2 could be made bigger and clearer by removing white space between the panels post rebuttal i increased my score from 6 to 8 given that dcode allows to approximate the latent ode even when the smoother does not perform well i would lean towards acceptance of the paper using symbolic regression to approximate closedform odes is novel and very interesting and the results presented in the paper are compelling however i would want to see more comparisons to other methods as well as results on more datasets to gain a better understanding of when this method will be useful and outperform existing methods i am hoping that the authors will be able to to answer my concerns by presenting more results docsepthe paper addresses the problem of learning closed form odes from observational data when the observation can be noisy and not frequent enough that instantaneous derivatives can be estimated with low enough error the work uses a variational criterion on the solution of the ode circumventing the need to evaluate instantaneous derivatives it uses existing symbolic regression techniques but instead of the regression loss it minimizes a loss term quantifying the violation of the variational constraints it is a work combining existing results in a novel way but offers new perspective and there are synergies between the different parts applied the method is clearly described and the potential benefits are clearly understandable see summary however i find the baseline comparison somewhat incomplete my main questions are about the neuralode comparison i do not agree that learning a chaotic dynamics directly with a node is not possible the author argue that we need to estimate the initial condition what makes it impossible due to the chaotic instability only thing we actually need to train a node is to minimize the error in a shorter time horizon specifically if we have samples more often than the lyapunov time we should be able to learn the mapping we integrate the dynamics forward inside a smaller than lyapunov time window and use this error to learn please clarify this point and train the 
node with a loss appropriate and show that result on the rightmost plot of figure 4 a simulation should stay on the attractor of course the concrete trajectory is impossible to reproduce the valid argument still remains that node is not interpretable while closed form ode is i have no doubt on this point however if node training is actually possible it can be used similarly as gp and others in the twostep methods and the gradient networks can be distilled to a closed form equation the work is derivative in methodology but provide enough insight to be interesting i have some questions on the neural ode comparison what would need some clarification if this concern is addressed it would make it easier to support acceptance ### Summary:
this paper introduces a new technique for discovering closedform functional forms (ordinary differential equations) that explain noisy observed trajectories x(t), where the derivative label dx/dt = f(x(t), t) is not observed and the method does not try to approximate it the method first tries to approximate a smoothed trajectory xhat(t), then relies on a variational formulation using a loss function over functionals c_j defined in terms of an orthonormal basis g_1, ..., g_s of sampling functions, such that the sum of squares of all the c_j approximates the theoretical distance between the estimated f and the true velocity field along the solution these sampling functions are typically chosen to be a basis of sine functions the method is evaluated on several canonical odes (growth model, glycolytic oscillator, lorenz chaotic attractor) and compared to gaussian processbased differentiation, splinebased differentiation, and regularised differentiation, and applied to model the temporal effect of chemotherapy on tumor volume reviewers found that the paper was wellmotivated and easy to follow (ebvj), well evaluated (ebvj), offering new perspectives to symbolic regression (79ft) reviewer vag3 had their concerns addressed reviewer zddy had concerns about the running time (a misunderstanding that was clarified) and the lack of comparison to a simple baseline consisting in double optimisation over f and xhat0 using neural odes the authors have added a neural ode baseline but were in disagreement with zddy and 79ft about their limitations reviewers engaged in a discussion with the authors and the scores are 6 6 8 8 i believe that the paper definitely meets the conference acceptance bar and would advocate for its inclusion as a spotlight in the conference
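the summary above describes the loss as a sum of squared functionals c_j built from sine test functions evaluated on the smoothed trajectory. the sketch below is a rough reconstruction of that kind of weak-form objective rather than the authors' implementation: the sine basis, the trapezoidal quadrature, the logistic toy system, and the assumption that the test functions vanish at the interval endpoints (so integration by parts removes the derivative of the data) are all choices of this illustration.

```python
import numpy as np

def trapz(y, t):
    # trapezoidal rule along the last axis
    dt = np.diff(t)
    return np.sum(0.5 * (y[..., 1:] + y[..., :-1]) * dt, axis=-1)

def sine_basis(t, s, T):
    # g_j(t) = sqrt(2/T) sin(j*pi*t/T), j = 1..s; vanishes at t = 0 and t = T
    js = np.arange(1, s + 1)[:, None]
    g = np.sqrt(2.0 / T) * np.sin(js * np.pi * t[None, :] / T)
    g_dot = np.sqrt(2.0 / T) * (js * np.pi / T) * np.cos(js * np.pi * t[None, :] / T)
    return g, g_dot

def variational_loss(f_candidate, t, x_hat, s=50):
    """Sum of squared functionals c_j(f) for a 1-d smoothed trajectory x_hat.

    c_j(f) = int g_j(t) f(x_hat(t)) dt + int g_j'(t) x_hat(t) dt, which equals
    int g_j(t) (f(x_hat) - d x_hat/dt) dt after integration by parts, so the
    noisy data is never differentiated.
    """
    T = t[-1] - t[0]
    g, g_dot = sine_basis(t - t[0], s, T)
    fx = f_candidate(x_hat)  # candidate velocity field evaluated on the smoothed path
    c = trapz(g * fx[None, :], t) + trapz(g_dot * x_hat[None, :], t)
    return np.sum(c ** 2)

# toy check: logistic growth dx/dt = x(1 - x), starting from x(0) = 0.1
t = np.linspace(0.0, 10.0, 400)
x_hat = 1.0 / (1.0 + 9.0 * np.exp(-t))  # stand-in for a GP/spline smoother of noisy data

true_f = lambda x: x * (1.0 - x)
wrong_f = lambda x: 0.5 * x
print(variational_loss(true_f, t, x_hat))   # close to zero (quadrature error only)
print(variational_loss(wrong_f, t, x_hat))  # much larger
```

in a symbolic-regression loop this loss would simply replace the usual derivative-matching loss: each candidate expression f produced by the genetic search is scored by variational_loss and the lowest-scoring expression is kept.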
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper suggests an interesting approach to knowledge distillation which uses architectural properties rather than the loss function to encourage knowledge transfer between a teacher and a student a student and teacher net are jointly trained on a prediction task with forward connections from layers in the student net to layers in the teacher net at test time the teacher net can be stripped away the method results in improvements in student performance compared to training the student without the teacher the main positives i see in this paper are: modest improvements over baselines in the onestage kd setting; provides a different perspective on kd compared to recent works that are based on loss functions although the results look promising i think the paper is not yet ready for publication the main issues i see are: no comparisons against network pruning and compression methods which are arguably more relevant than kd baselines; little insight or analysis of why the method works; concerns with the experiments detailed below this paper positions itself as a kd method and argues that a novelty is in doing kd via architectural tricks rather than via a loss function this may indeed be a new perspective for the kd literature but it isnt without precedent methods that use architectural scaffolding at training time which is removed at test time do exist simply under other names such as weight pruning or model compression many of these papers are cited in the intro of the current submission the current paper needs to make it clearer how the proposed method relates to those methods and how it goes beyond them ideally this should include quantitative comparisons or an explanation for why the other methods are not applicable my second major concern is that this paper provides very little in the way of an explanation for why the method works there is an intriguing statement that the method enhances the learning performance of the student model due to the backward gradient flows from the teacher but nothing to back this statement up some analysis of how these gradients achieve desirable effects would greatly strengthen the paper my third concern is with the experiments first the numbers in table 2 are somewhat lower than those reported in prior papers see table 1 of tian et al 2020 appendix a2 states that the code from tian et al 2020 was used to run the comparisons why then the discrepancy in performance second many of the prior kd methods perform best when their objective is combined with the original hinton kd objective see table 7 of tian et al 2020 but this comparison is not provided in the current paper these two concerns mean that im not sure the proposed method is really outperforming competitive baselines from prior work lastly i did not find enough details about ecd to be able to really evaluate if those results are strong or interesting stylistically i think the paper would be improved by adopting a more evenhanded tone statements like compared to existing kd methods ecd is more friendly to end users thanks to its simplicity or that the method is simple and neat come across more as advertisement rather than as scientific analysis the intro argues that the onestage nature of the proposed approach makes it more applicable than twostage approaches but i would say one and twostage methods are simply targeting qualitatively different applications two stage approaches are useful when you
are given a big model which maybe you do not have the resources or data to train and want to compress it or adapt it eg for mobile deployment one stage approaches are useful when you are able to train the big model yourself there are interesting tradeoffs between these two paradigms and one is not better than the other the current paper should acknowledge this in general the advantages and disadvantages of the method should be discussed equally i also think the paper overemphasizes how simple the method is i dont personally feel this method is any simpler than methods that use loss functions and in fact i find the proposed method more conceptually complex since i dont know why it works despite these criticisms something interesting does seem to be going on with this method and i encourage the authors to pursue it further minor comments 1 abstract temporally temporarily 2 table 1 what is the baseline the student 3 table 2 citations should be added for each methoddocsepsummary the paper proposes new kd framework ie explicit connection distillation ecd which unlike existing methods designs teacher network that is well aligned with the student architecture and trains both the networks simultaneously using explicit dense feature connections the proposed method is evaluated on cifar100 and imagenet datasets strengths the proposed method neither requires any explicit pretrained teacher network nor any distillation loss so the method overcomes the problem of selecting teacher network or alternatives of distillation losses for the task at hand by design the generated teacher network has features aligned with the student network at every layer concerns though existing works involves complex optimization in terms of losses but the hyperparameters involved in distillation like the weight on distillation loss or the temperature value is not so sensitive like learning rate even without careful tuning decent distillation performance can be achieved with moderate temperature high weight on distillation loss and low weight on cross entropy loss so this is not a major limitation in existing methods in the proposed ecd framework both the teacher and student networks are trained simultaneously so number of trainable parameters teacher parameters student parameters would be large so the method may not work well in case of limited amount of training samples selecting an optimal value of kernel number n is a concern the gain in performance in table 1 for wrn402 is marginal so it seems the proposed method may not be effective on some architectures like wide resnets where the network channels are widened in table 5 marginal improvement using ecd over stage wise feature supervision table 4 shows shallow layers migrate more information than higher layers and dense connections are preferred on shallow layers only to get optimal performance but identifying the layer from which high level semantics would be captured is nontrivial queries for authors any restriction or range of values that alpha can take all the experiments are done with n16 how the performance changes by varying n is the performance of the teacher reported in table 1 obtained through auxiliary teacher involving feature connections with the student network while training using the proposed ecd how to decide number of epochs for training based on either teacher or student performance on validation data details about ecd and how learnable ensemble is applied is not mentioned in detail even in appendix general remarks while the creation of auxiliary 
teacher directly from the student network removes its dependencies from pretrained teacher but dependency on several design choices like the number of dynamic additive convolutions for the first module and appropriate places for adding connection paths in the second module for explicit flow of layer to layer gradients remaindocsepthe authors propose a new knowledge distillation method applicable to convolutional neural networks given the architecture of a student network the teacher network is constructed by replacing each convolutional layer with a dynamic additive convolutional layer in this way the teacher network is guaranteed to be more capable than the student network then the teacher and student models are trained together minimizing their own training losses in order to distill the student model the student and teacher model are interconnected in a way such that the student model receives gradient flow from the teacher network pros 1 the proposed method is straightforward and easy to implement 2 by adding an additional distillation loss on the logits the ecd method it achieved competitive distillation results cons 1 the authors claim that the gradient flow from the teacher model to the student model helps improving the student model but i do not see why the gradient is from the teachers loss so by applying the gradient on the student model only the teachers loss can be improved this is my greatest concern similarly from equation 2 i do not see why optimizing thetat would help the generalization ability of the student model 2 the authors made several claims about existing knowledge distillation methods for me many of the claims are not meaningful for example in real applications for twostage kd methods well pretrained teacher models are usually not available this is true but what is the drawback of well pretrained teacher models are usually not available all distillation methods require a teacher model the proposed method still need to train a teacher model in fact i might claim that twostage kd methods can utilize existing pretrained teacher models whereas the proposed method always has to train a teacher model during distillation similar claim was made on designing complex distillation losses as a drawback but a user does not have to design new forms of losses if heshe stick to existing methods on the other hand if the proposed method became popular researchers may add auxiliary losses on top of the proposed method and that would not seem like a drawback for the proposed method some confusions 1 when comparing to different distillation methods in table 2 are all teacher models have the same architecture 2 when generating the teacher model do the authors initialize the weights of the teacher model randomly or according to the student model after reading the author feedback i upgrade my rating to 6 see responses in this thread for reasonsdocsepoverall i vote for marginally below the acceptance threshold i think the idea is somewhat novel for the explicit connection distillation especially for crossnetwork layertolayer gradients propagation this paper proposes a new strategy of knowledge distillation called explicit connection distillation ecd and the ecd achieving knowledge transfer via crossnetwork layertolayer gradients propagation without considering conventional knowledge distillation losses experiments on popular image classification tasks show the proposed method is effective however several concerns including the clarity of the paper and some additional ablation studies see 
cons below result in the decision pros 1 the knowledge distillation by crossnetwork layertolayer gradients propagation is somewhat novel to me 2 this paper is easy to follow the methodology part is clear to me 3 the experiments part shows a detailed ablation study of each component and the supplementary lists almost all details of the experiments which helps the community to reproduce the proposed methods cons 1 the first concern is about motivation 1 the author claims conventional kd methods lead to complex optimization objectives since they introduce two extra hyperparameters to the best of my knowledge these two parameters do not have too much search space eg the temperature is from 3 to 5 and the weight is t2 from hintons paper and following papers 2 the drawback of onestage kd methods is a little bit overclaimed both one and dml can be applied to a variety of architectures in my opinion the teacher design of ecd follows a similar strategy to one and its variants which is that the teacher is wider than the student overall i think the motivation of this paper needs to be stated very carefully and clearly 2 the fairness of comparison is the dynamic additive convolution component also used in the student network does this influence the comparison of table2 do one and dml also use that 3 why are the automatically generated teachers of ecd much lower than other methods in table2 in terms of performance yet result in higher student performance is there any explanation here such as in 12 4 could you provide the computation cost comparison of the proposed method and other onestage methods in table2 5 some recent sota works are missed 34 although i know the performance of this paper is outperformed i think they need to be discussed reference 1 improved knowledge distillation via teacher assistant 2 search to distill pearls are everywhere but not the eyes 3 online knowledge distillation via collaborative learning 4 peer collaborative learning for online knowledge distillation ### Summary:
knowledge distillation kd has been widely used in practice for deployment in this paper a variant of kd is proposed given a student network an auxiliary teacher architecture is temporarily generated via dynamic additive convolutions dense feature connections are introduced to cotrain the teacher and student models the proposed method is novel and interesting empirical results showed that the proposed method can perform better than several kd variants however it is unclear why the proposed method works although the authors tried to address this issue in their rebuttal besides this a bigger concern about this work is that it missed a comparison with a recent approach in 1 which looks much simpler and performs significantly better on similar experiments in 1 their resnet50 0.5x is smaller than the student model in this paper which used more filters on the top but showed much stronger performance on both relative and absolute improvements over the same baseline training from scratch for the imagenet classification task on the technical side the method in 1 simply uses the original resnet50 as the teacher model and the student model resnet50 0.5x progressively mimics the intermediate outputs of the teacher model from layer to layer 1 also contains a theoretical analysis based on meanfield analysis to support their method compared with the method in 1 the proposed method here is more complicated less motivated and less efficient 1 d zhou m ye c chen t meng m tan x song q le q liu and d schuurmans go wide then narrow efficient training of deep thin networks icml 2020
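To make the mechanism the reviews debate more concrete, the following is a minimal, illustrative PyTorch sketch (an assumption made for illustration, not the authors' ECD code) of jointly training a student and a wider auxiliary teacher derived from it, where each teacher block also consumes a projection of the student's features; because of these connection paths, the gradient of the teacher's loss reaches the student's parameters even though this sketch has no explicit distillation loss term. The block structure, widths, and projection-based connections are all assumed.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StudentBlock(nn.Module):
    def __init__(self, c_in, c_out):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, 3, padding=1)
        self.bn = nn.BatchNorm2d(c_out)

    def forward(self, x):
        return F.relu(self.bn(self.conv(x)))

class TeacherBlock(nn.Module):
    """Wider block that fuses its own features with a projection of the student's."""
    def __init__(self, c_in, c_out, c_student):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, 3, padding=1)
        self.bn = nn.BatchNorm2d(c_out)
        self.proj = nn.Conv2d(c_student, c_out, 1)  # connection path from the student

    def forward(self, x, s_feat):
        return F.relu(self.bn(self.conv(x)) + self.proj(s_feat))

class JointStudentTeacher(nn.Module):
    def __init__(self, num_classes=100, s_ch=(16, 32), width_mult=4):
        super().__init__()
        t_ch = tuple(c * width_mult for c in s_ch)  # teacher is the student, made wider
        self.s1, self.s2 = StudentBlock(3, s_ch[0]), StudentBlock(s_ch[0], s_ch[1])
        self.t1 = TeacherBlock(3, t_ch[0], s_ch[0])
        self.t2 = TeacherBlock(t_ch[0], t_ch[1], s_ch[1])
        self.s_head = nn.Linear(s_ch[1], num_classes)
        self.t_head = nn.Linear(t_ch[1], num_classes)

    def forward(self, x):
        f1 = self.s1(x)
        f2 = self.s2(f1)
        g1 = self.t1(x, f1)   # teacher consumes student features, so the teacher's
        g2 = self.t2(g1, f2)  # loss produces nonzero gradients w.r.t. student weights
        s_logits = self.s_head(F.adaptive_avg_pool2d(f2, 1).flatten(1))
        t_logits = self.t_head(F.adaptive_avg_pool2d(g2, 1).flatten(1))
        return s_logits, t_logits

model = JointStudentTeacher()
opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 100, (8,))
s_logits, t_logits = model(x)
# each network minimizes its own cross-entropy; this sketch uses no explicit KD loss term
loss = F.cross_entropy(s_logits, y) + F.cross_entropy(t_logits, y)
opt.zero_grad()
loss.backward()
opt.step()
```

Under this sketch, one reviewer's concern can be stated precisely: the only signal the student receives from the teacher branch is the gradient of the teacher's loss with respect to the student's features through the projection layers, so whether that signal actually improves student generalization is exactly the open question.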
[ 320, 2104, 281, 1663, 7472, 604, 1110, 1543, 403, 2266, 390, 4722, 50276, 296, 1190, 18260, 891, 1158, 253, 2929, 651, 320, 5520, 407, 25987, 247, 625, 1014, 22124, 10541, 7234, 751, 2429, 281, 5368, 465, 69, 3082, 299, 2428, 310, 625, 11453, 281, 990, 4212, 6701, 281, 697, 17647, 390, 326, 253, 1332, 310, 2969, 285, 18176, 1705, 2439, 625, 347, 28064, 2581, 685, 347, 8249, 1783, 253, 26432, 8219, 326, 253, 327, 383, 486, 3753, 273, 253, 4081, 2746, 2789, 352, 625, 7763, 685, 2500, 493, 486, 7274, 533, 891, 651, 1333, 581, 285, 2500, 493, 486, 3082, 403, 3365, 12262, 36143, 1027, 4893, 767, 3924, 7274, 403, 4217, 672, 368, 403, 1677, 247, 1943, 1566, 534, 5046, 368, 513, 417, 452, 253, 5300, 390, 941, 281, 6194, 285, 971, 281, 19477, 352, 390, 5223, 352, 24088, 323, 6109, 19007, 581, 3924, 7274, 403, 4217, 672, 368, 403, 2104, 281, 6194, 253, 1943, 1566, 4834, 627, 403, 4722, 5454, 14273, 875, 841, 767, 11951, 304, 983, 285, 581, 310, 417, 1805, 685, 253, 643, 253, 1655, 2929, 943, 14409, 436, 275, 2087, 253, 11361, 285, 23797, 273, 253, 1332, 943, 320, 5469, 9696, 891, 671, 1158, 253, 2929, 25732, 2013, 545, 284, 4219, 849, 2969, 253, 1332, 310, 891, 13414, 11697, 1928, 436, 1332, 310, 667, 19554, 685, 3082, 326, 897, 2957, 3470, 285, 275, 958, 891, 1089, 253, 4081, 1332, 625, 4473, 1230, 2570, 1580, 891, 13414, 871, 2139, 352, 2987, 50276, 3229, 3784, 841, 43680, 1633, 4722, 1057, 1646, 281, 320, 1469, 327, 342, 436, 1332, 285, 891, 11907, 253, 4477, 281, 15142, 352, 2007, 50276, 37585, 5701, 337, 12002, 5897, 595, 50276, 5142, 1831, 3441, 374, 2829, 337, 752, 310, 253, 8245, 253, 5974, 495, 2829, 374, 30404, 943, 320, 2879, 323, 1016, 1332, 7152, 339, 793, 360, 3454, 50276, 783, 2929, 29328, 747, 465, 69, 7792, 26332, 6843, 4602, 940, 21755, 299, 2428, 534, 12401, 5368, 3082, 11809, 9732, 2990, 326, 310, 973, 15616, 342, 253, 5974, 10336, 285, 18784, 1097, 253, 6928, 10486, 970, 6843, 14086, 4735, 10291, 253, 4081, 1332, 310, 6760, 327, 260, 338, 274, 2313, 285, 4440, 257, 292, 15302, 50276, 296, 3755, 20556, 50275, 783, 4081, 1332, 6747, 4419, 667, 6843, 3215, 11273, 9732, 2990, 4543, 667, 940, 21755, 2957, 594, 253, 1332, 689, 3217, 253, 1895, 273, 17221, 9732, 2990, 390, 18075, 273, 940, 21755, 11655, 323, 253, 4836, 387, 1133, 50276, 1615, 2216, 253, 4561, 9732, 2990, 556, 3386, 15616, 342, 253, 5974, 2990, 387, 1046, 3828, 50276, 585, 1209, 2224, 50275, 2004, 5368, 2987, 8687, 2570, 13757, 275, 2426, 273, 11655, 533, 253, 4373, 22041, 3206, 275, 940, 21755, 751, 253, 2801, 327, 940, 21755, 2957, 390, 253, 3276, 1318, 310, 417, 594, 7996, 751, 4715, 2281, 1014, 1293, 10182, 25184, 12524, 940, 21755, 3045, 476, 320, 6786, 342, 10290, 3276, 1029, 2801, 327, 940, 21755, 2957, 285, 1698, 2801, 327, 2831, 15579, 2957, 594, 436, 310, 417, 247, 2201, 12291, 275, 5368, 3082, 50276, 249, 253, 4081, 299, 2428, 7792, 1097, 253, 9732, 285, 5974, 6928, 403, 10166, 10486, 594, 1180, 273, 6194, 494, 3602, 9732, 3602, 50276, 39095, 3602, 651, 320, 1781, 594, 253, 1332, 778, 417, 789, 973, 275, 1083, 273, 3710, 2408, 273, 3733, 3530, 50276, 7135, 272, 271, 8654, 1318, 273, 10295, 1180, 295, 310, 247, 4468, 50276, 783, 6351, 275, 3045, 275, 2829, 337, 323, 1488, 79, 24948, 310, 16888, 594, 352, 3133, 253, 4081, 1332, 778, 417, 320, 3576, 327, 690, 35615, 751, 4618, 501, 47301, 835, 253, 2990, 8123, 403, 42248, 50276, 249, 2829, 608, 16888, 7756, 970, 299, 2428, 689, 3924, 15822, 4735, 20446, 50276, 2420, 577, 2722, 20126, 8090, 31690, 625, 1491, 685, 2169, 8090, 285, 14086, 10291, 403, 9013, 327, 
20126, 8090, 760, 281, 755, 8654, 3045, 533, 12488, 253, 3828, 432, 534, 1029, 1268, 35185, 651, 320, 10848, 310, 37825, 50276, 371, 12395, 323, 4477, 50275, 1279, 12400, 390, 2491, 273, 2193, 326, 9765, 476, 1379, 50276, 455, 253, 4679, 403, 2218, 342, 295, 1036, 849, 253, 3045, 2544, 407, 11962, 295, 50276, 261, 253, 3045, 273, 253, 9732, 2361, 275, 2829, 337, 2797, 949, 24026, 9732, 7668, 4735, 10291, 342, 253, 5974, 2990, 50276, 6050, 3733, 970, 253, 4081, 299, 2428, 849, 281, 7617, 1180, 273, 44540, 323, 3733, 1754, 327, 2057, 9732, 390, 5974, 3045, 327, 12820, 941, 50276, 23454, 670, 299, 2428, 285, 849, 3037, 494, 19862, 310, 3732, 310, 417, 5393, 275, 2508, 1014, 275, 30762, 50276, 16691, 16157, 50276, 6050, 253, 8869, 273, 24026, 9732, 3587, 432, 253, 5974, 2990, 26586, 697, 21011, 432, 3215, 11273, 9732, 533, 18925, 327, 2067, 2216, 10165, 751, 253, 1180, 273, 7870, 21842, 2410, 17009, 323, 253, 806, 6333, 285, 4569, 5053, 323, 6240, 4602, 11865, 275, 253, 1273, 6333, 323, 6843, 2685, 273, 3828, 281, 3828, 27935, 3464, 7152, 339, 431, 248, 4477, 12661, 247, 747, 3640, 940, 21755, 1332, 7763, 281, 27311, 267, 11454, 6928, 1677, 253, 10336, 273, 247, 5974, 2990, 253, 9732, 2990, 310, 8818, 407, 15706, 1016, 27311, 267, 3828, 342, 247, 7870, 21842, 27311, 267, 3828, 275, 436, 1039, 253, 9732, 2990, 310, 16293, 281, 320, 625, 7032, 685, 253, 5974, 2990, 840, 253, 9732, 285, 5974, 3210, 403, 10166, 2366, 28699, 616, 1211, 3733, 11655, 275, 1340, 281, 940, 408, 253, 5974, 1566, 253, 5974, 285, 9732, 1566, 403, 36282, 275, 247, 1039, 824, 326, 253, 5974, 1566, 14488, 11786, 2685, 432, 253, 9732, 2990, 50276, 856, 84, 337, 253, 4081, 1332, 310, 15246, 285, 3477, 281, 3359, 374, 407, 6240, 271, 3081, 940, 21755, 2957, 327, 253, 2412, 953, 253, 299, 2428, 1332, 352, 6786, 12085, 940, 21755, 1543, 50276, 5040, 337, 253, 4477, 1750, 326, 253, 11786, 2685, 432, 253, 9732, 1566, 281, 253, 5974, 1566, 7729, 11138, 253, 5974, 1566, 533, 891, 513, 417, 923, 2139, 253, 11786, 310, 432, 253, 10954, 2957, 594, 407, 9433, 253, 11786, 327, 253, 5974, 1566, 760, 253, 10954, 2957, 476, 320, 5520, 436, 310, 619, 6459, 4468, 12014, 432, 5150, 374, 891, 513, 417, 923, 2139, 39793, 253, 41506, 651, 1361, 253, 26647, 3745, 273, 253, 5974, 1566, 374, 253, 4477, 1160, 2067, 3916, 670, 5368, 3640, 940, 21755, 3082, 323, 479, 1142, 273, 253, 3916, 403, 417, 14282, 323, 1650, 275, 1524, 4893, 323, 2500, 493, 486, 465, 69, 3082, 973, 3215, 11273, 9732, 3210, 403, 3798, 417, 2130, 50276, 2520, 310, 2032, 533, 752, 310, 253, 32489, 273, 973, 3215, 11273, 9732, 3210, 403, 3798, 417, 2130, 512, 940, 21755, 3082, 2430, 247, 9732, 1566, 253, 4081, 1332, 1335, 878, 281, 6194, 247, 9732, 1566, 275, 958, 891, 1537, 1750, 326, 2500, 493, 486, 465, 69, 3082, 476, 16584, 5368, 3215, 11273, 9732, 3210, 5727, 253, 4081, 1332, 1900, 556, 281, 6194, 247, 9732, 1566, 1309, 940, 21755, 2074, 1750, 369, 1160, 327, 20462, 2570, 940, 21755, 11655, 347, 247, 32489, 533, 247, 2608, 1057, 417, 452, 281, 2216, 747, 4948, 273, 11655, 604, 344, 6689, 7356, 281, 5368, 3082, 327, 253, 643, 1133, 604, 253, 4081, 1332, 3395, 4633, 8607, 778, 823, 24026, 11655, 327, 1755, 273, 253, 4081, 1332, 285, 326, 651, 417, 1646, 751, 247, 32489, 323, 253, 4081, 1332, 50275, 8826, 1461, 16723, 337, 672, 10941, 281, 1027, 940, 21755, 3082, 275, 2829, 374, 403, 512, 9732, 3210, 452, 253, 1072, 10336, 374, 672, 11365, 253, 9732, 1566, 513, 253, 4477, 26641, 253, 13461, 273, 253, 9732, 1566, 12421, 390, 2556, 281, 253, 5974, 1566, 50276, 6438, 4361, 253, 2488, 
8680, 891, 15047, 619, 13716, 281, 721, 923, 6128, 275, 436, 6293, 323, 4606, 7152, 33032, 1189, 455, 891, 6273, 323, 42876, 2708, 253, 14924, 7887, 891, 1158, 253, 2934, 310, 8489, 4460, 323, 253, 6843, 4602, 940, 21755, 3340, 323, 2831, 18428, 2242, 797, 311, 4071, 27935, 18634, 436, 2929, 29328, 247, 747, 5700, 273, 3640, 940, 21755, 1925, 50276, 911, 20692, 4602, 940, 21755, 299, 2428, 285, 253, 299, 2428, 17170, 3640, 3700, 3066, 2831, 18428, 2242, 797, 311, 4071, 27935, 18634, 1293, 7296, 6041, 3640, 940, 21755, 11655, 4679, 327, 4633, 2460, 9162, 8892, 921, 253, 4081, 1332, 310, 3576, 2299, 2067, 7350, 1690, 253, 19843, 273, 253, 2929, 285, 690, 3081, 28913, 2175, 50276, 2887, 772, 2708, 906, 275, 253, 3061, 50275, 856, 84, 50276, 18, 253, 3640, 940, 21755, 407, 2831, 18428, 2242, 797, 311, 4071, 27935, 18634, 310, 8489, 4460, 281, 479, 374, 436, 2929, 310, 3477, 281, 956, 253, 16182, 629, 310, 2590, 281, 479, 495, 253, 4679, 629, 921, 2508, 28913, 1263, 273, 1016, 4445, 285, 253, 24864, 10894, 2761, 2508, 273, 4679, 534, 1361, 253, 3114, 281, 18302, 253, 4081, 3082, 50275, 5040, 50276, 18, 253, 806, 4468, 310, 670, 16038, 337, 253, 2488, 3916, 6041, 465, 69, 3082, 5644, 281, 2570, 13757, 16566, 1580, 597, 9569, 767, 4465, 4373, 22041, 281, 619, 1682, 273, 253, 3640, 841, 767, 3602, 452, 417, 1512, 1199, 3186, 2317, 2563, 3276, 310, 432, 4791, 285, 253, 2801, 310, 246, 19, 432, 288, 2878, 790, 2929, 285, 1563, 2929, 50276, 19, 253, 32489, 273, 327, 383, 486, 465, 69, 3082, 310, 247, 1652, 2372, 689, 7041, 50276, 15617, 581, 285, 277, 1686, 476, 320, 3732, 281, 247, 5235, 273, 10336, 275, 619, 4743, 253, 9732, 2216, 273, 50276, 886, 69, 50276, 25739, 84, 247, 2074, 5700, 342, 581, 285, 697, 11640, 534, 310, 253, 9732, 310, 14200, 685, 253, 5974, 50276, 1189, 455, 891, 1158, 253, 16038, 273, 436, 2929, 3198, 281, 320, 1077, 10182, 281, 2590, 374, 253, 28959, 273, 5301, 310, 253, 7870, 21842, 27311, 4445, 908, 281, 275, 5974, 2990, 1057, 436, 588, 4833, 253, 5301, 273, 50276, 2420, 19, 1057, 581, 285, 277, 1686, 671, 897, 326, 495, 2139, 253, 8356, 4561, 10954, 273, 299, 2428, 310, 1199, 2406, 685, 643, 3082, 275, 2829, 19, 275, 1307, 273, 3045, 285, 1543, 275, 2169, 5974, 3045, 310, 627, 667, 8813, 1060, 751, 824, 347, 1249, 577, 812, 368, 2085, 253, 13782, 2105, 5301, 273, 253, 4081, 1332, 285, 643, 327, 383, 486, 3082, 275, 2829, 19, 608, 690, 4102, 256, 5503, 789, 310, 9829, 5910, 3738, 891, 871, 253, 3045, 273, 436, 2929, 310, 41731, 10574, 891, 1158, 597, 878, 281, 320, 5469, 50275, 14005, 337, 5520, 3640, 940, 21755, 3066, 9732, 13372, 50276, 19, 3186, 281, 940, 408, 27887, 5200, 403, 11678, 533, 417, 253, 2927, 50276, 20, 3909, 3640, 940, 21755, 3066, 27549, 4715, 50276, 21, 14218, 27549, 4715, 323, 3909, 3640, 940, 21755, 2490, 187, 4118, 18435, 27, 36871, 940, 21755, 465, 69, 556, 644, 7561, 908, 275, 3946, 323, 19007, 50276, 249, 436, 2929, 247, 12955, 273, 465, 69, 310, 4081, 1677, 247, 5974, 2990, 271, 24026, 9732, 10336, 310, 20220, 4561, 3066, 7870, 21842, 2410, 17009, 14086, 4735, 10291, 403, 5611, 281, 13450, 1949, 253, 9732, 285, 5974, 3210, 253, 4081, 1332, 310, 4460, 285, 4722, 16774, 1543, 2692, 326, 253, 4081, 1332, 476, 1347, 1805, 685, 2067, 465, 69, 11640, 50276, 35529, 352, 310, 12744, 2139, 253, 4081, 1332, 2987, 3738, 253, 4477, 3597, 281, 2953, 436, 2523, 275, 616, 30080, 22559, 50275, 67, 11587, 436, 50276, 66, 8750, 4468, 327, 436, 789, 310, 326, 352, 9829, 247, 5301, 342, 247, 3332, 2746, 275, 337, 534, 4453, 1199, 19554, 285, 17923, 3012, 1805, 327, 
2074, 4679, 50276, 249, 337, 616, 501, 3024, 1235, 16987, 89, 310, 4577, 685, 253, 5974, 1566, 275, 436, 2929, 534, 908, 625, 15116, 327, 253, 1755, 533, 2692, 1199, 10046, 3045, 327, 1097, 4103, 285, 7880, 11701, 689, 253, 1072, 8245, 3733, 432, 20041, 323, 253, 4440, 257, 292, 9162, 4836, 327, 253, 7681, 1930, 253, 1332, 275, 337, 3365, 4648, 253, 3236, 501, 3024, 1235, 347, 253, 9732, 1566, 50276, 395, 253, 5974, 1566, 501, 3024, 1235, 16987, 89, 31414, 43341, 253, 10444, 18012, 273, 253, 9732, 1566, 432, 3828, 281, 3828, 337, 671, 4428, 247, 50276, 783, 30325, 1783, 50276, 10722, 3423, 1783, 1754, 281, 1329, 616, 1332, 10941, 342, 253, 1332, 275, 337, 253, 4081, 1332, 1060, 310, 625, 9542, 1679, 17194, 285, 1679, 5919, 50275, 18, 277, 1182, 14451, 278, 9094, 260, 260, 864, 246, 278, 1205, 278, 23136, 1269, 4498, 2805, 458, 2805, 632, 86, 285, 277, 5807, 86, 321, 24044, 564, 4618, 840, 6891, 5919, 3733, 273, 3676, 6906, 6928, 17857, 1686, 9169 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 320, 2104, 281, 1663, 7472, 604, 1110, 1543, 403, 2266, 390, 4722, 50276, 296, 1190, 18260, 891, 1158, 253, 2929, 651, 320, 5520, 407, 25987, 247, 625, 1014, 22124, 10541, 7234, 751, 2429, 281, 5368, 465, 69, 3082, 299, 2428, 310, 625, 11453, 281, 990, 4212, 6701, 281, 697, 17647, 390, 326, 253, 1332, 310, 2969, 285, 18176, 1705, 2439, 625, 347, 28064, 2581, 685, 347, 8249, 1783, 253, 26432, 8219, 326, 253, 327, 383, 486, 3753, 273, 253, 4081, 2746, 2789, 352, 625, 7763, 685, 2500, 493, 486, 7274, 533, 891, 651, 1333, 581, 285, 2500, 493, 486, 3082, 403, 3365, 12262, 36143, 1027, 4893, 767, 3924, 7274, 403, 4217, 672, 368, 403, 1677, 247, 1943, 1566, 534, 5046, 368, 513, 417, 452, 253, 5300, 390, 941, 281, 6194, 285, 971, 281, 19477, 352, 390, 5223, 352, 24088, 323, 6109, 19007, 581, 3924, 7274, 403, 4217, 672, 368, 403, 2104, 281, 6194, 253, 1943, 1566, 4834, 627, 403, 4722, 5454, 14273, 875, 841, 767, 11951, 304, 983, 285, 581, 310, 417, 1805, 685, 253, 643, 253, 1655, 2929, 943, 14409, 436, 275, 2087, 253, 11361, 285, 23797, 273, 253, 1332, 943, 320, 5469, 9696, 891, 671, 1158, 253, 2929, 25732, 2013, 545, 284, 4219, 849, 2969, 253, 1332, 310, 891, 13414, 11697, 1928, 436, 1332, 310, 667, 19554, 685, 3082, 326, 897, 2957, 3470, 285, 275, 958, 891, 1089, 253, 4081, 1332, 625, 4473, 1230, 2570, 1580, 891, 13414, 871, 2139, 352, 2987, 50276, 3229, 3784, 841, 43680, 1633, 4722, 1057, 1646, 281, 320, 1469, 327, 342, 436, 1332, 285, 891, 11907, 253, 4477, 281, 15142, 352, 2007, 50276, 37585, 5701, 337, 12002, 5897, 595, 50276, 5142, 1831, 3441, 374, 2829, 337, 752, 310, 253, 8245, 253, 5974, 495, 2829, 374, 30404, 943, 320, 2879, 323, 1016, 1332, 7152, 339, 793, 360, 3454, 50276, 783, 2929, 29328, 747, 465, 69, 7792, 26332, 6843, 4602, 940, 21755, 299, 2428, 534, 12401, 5368, 3082, 11809, 9732, 2990, 326, 310, 973, 15616, 342, 253, 5974, 10336, 285, 18784, 1097, 253, 6928, 10486, 970, 6843, 14086, 4735, 10291, 253, 4081, 1332, 310, 6760, 327, 260, 338, 274, 2313, 285, 4440, 257, 292, 15302, 50276, 296, 3755, 20556, 50275, 783, 4081, 1332, 6747, 4419, 667, 6843, 3215, 11273, 9732, 2990, 4543, 667, 940, 21755, 2957, 594, 253, 1332, 689, 3217, 253, 1895, 273, 17221, 9732, 2990, 390, 18075, 273, 940, 21755, 11655, 323, 253, 4836, 387, 1133, 50276, 1615, 2216, 253, 4561, 9732, 2990, 556, 3386, 15616, 342, 253, 5974, 2990, 387, 1046, 3828, 50276, 585, 1209, 2224, 50275, 2004, 5368, 2987, 8687, 2570, 13757, 275, 2426, 273, 11655, 533, 253, 4373, 22041, 3206, 275, 940, 21755, 751, 253, 2801, 327, 940, 21755, 2957, 390, 253, 3276, 1318, 310, 417, 594, 7996, 751, 4715, 2281, 1014, 1293, 10182, 25184, 12524, 940, 21755, 3045, 476, 320, 6786, 342, 10290, 3276, 1029, 2801, 327, 940, 21755, 2957, 285, 1698, 2801, 327, 2831, 15579, 2957, 594, 436, 310, 417, 247, 2201, 12291, 275, 5368, 3082, 50276, 249, 253, 4081, 299, 2428, 7792, 1097, 253, 9732, 285, 5974, 6928, 403, 10166, 10486, 594, 1180, 273, 6194, 494, 3602, 9732, 3602, 50276, 39095, 3602, 651, 320, 1781, 594, 253, 1332, 778, 417, 789, 973, 275, 1083, 273, 3710, 2408, 273, 3733, 3530, 50276, 7135, 272, 271, 8654, 1318, 273, 10295, 1180, 295, 310, 247, 4468, 50276, 783, 6351, 275, 3045, 275, 2829, 337, 323, 1488, 79, 24948, 310, 16888, 594, 352, 3133, 253, 4081, 1332, 778, 417, 320, 3576, 327, 690, 35615, 751, 4618, 501, 47301, 835, 253, 2990, 8123, 403, 42248, 50276, 249, 2829, 608, 16888, 7756, 970, 299, 2428, 689, 3924, 15822, 4735, 20446, 50276, 2420, 577, 2722, 20126, 8090, 31690, 625, 1491, 685, 2169, 8090, 285, 14086, 10291, 403, 9013, 327, 
20126, 8090, 760, 281, 755, 8654, 3045, 533, 12488, 253, 3828, 432, 534, 1029, 1268, 35185, 651, 320, 10848, 310, 37825, 50276, 371, 12395, 323, 4477, 50275, 1279, 12400, 390, 2491, 273, 2193, 326, 9765, 476, 1379, 50276, 455, 253, 4679, 403, 2218, 342, 295, 1036, 849, 253, 3045, 2544, 407, 11962, 295, 50276, 261, 253, 3045, 273, 253, 9732, 2361, 275, 2829, 337, 2797, 949, 24026, 9732, 7668, 4735, 10291, 342, 253, 5974, 2990, 50276, 6050, 3733, 970, 253, 4081, 299, 2428, 849, 281, 7617, 1180, 273, 44540, 323, 3733, 1754, 327, 2057, 9732, 390, 5974, 3045, 327, 12820, 941, 50276, 23454, 670, 299, 2428, 285, 849, 3037, 494, 19862, 310, 3732, 310, 417, 5393, 275, 2508, 1014, 275, 30762, 50276, 16691, 16157, 50276, 6050, 253, 8869, 273, 24026, 9732, 3587, 432, 253, 5974, 2990, 26586, 697, 21011, 432, 3215, 11273, 9732, 533, 18925, 327, 2067, 2216, 10165, 751, 253, 1180, 273, 7870, 21842, 2410, 17009, 323, 253, 806, 6333, 285, 4569, 5053, 323, 6240, 4602, 11865, 275, 253, 1273, 6333, 323, 6843, 2685, 273, 3828, 281, 3828, 27935, 3464, 7152, 339, 431, 248, 4477, 12661, 247, 747, 3640, 940, 21755, 1332, 7763, 281, 27311, 267, 11454, 6928, 1677, 253, 10336, 273, 247, 5974, 2990, 253, 9732, 2990, 310, 8818, 407, 15706, 1016, 27311, 267, 3828, 342, 247, 7870, 21842, 27311, 267, 3828, 275, 436, 1039, 253, 9732, 2990, 310, 16293, 281, 320, 625, 7032, 685, 253, 5974, 2990, 840, 253, 9732, 285, 5974, 3210, 403, 10166, 2366, 28699, 616, 1211, 3733, 11655, 275, 1340, 281, 940, 408, 253, 5974, 1566, 253, 5974, 285, 9732, 1566, 403, 36282, 275, 247, 1039, 824, 326, 253, 5974, 1566, 14488, 11786, 2685, 432, 253, 9732, 2990, 50276, 856, 84, 337, 253, 4081, 1332, 310, 15246, 285, 3477, 281, 3359, 374, 407, 6240, 271, 3081, 940, 21755, 2957, 327, 253, 2412, 953, 253, 299, 2428, 1332, 352, 6786, 12085, 940, 21755, 1543, 50276, 5040, 337, 253, 4477, 1750, 326, 253, 11786, 2685, 432, 253, 9732, 1566, 281, 253, 5974, 1566, 7729, 11138, 253, 5974, 1566, 533, 891, 513, 417, 923, 2139, 253, 11786, 310, 432, 253, 10954, 2957, 594, 407, 9433, 253, 11786, 327, 253, 5974, 1566, 760, 253, 10954, 2957, 476, 320, 5520, 436, 310, 619, 6459, 4468, 12014, 432, 5150, 374, 891, 513, 417, 923, 2139, 39793, 253, 41506, 651, 1361, 253, 26647, 3745, 273, 253, 5974, 1566, 374, 253, 4477, 1160, 2067, 3916, 670, 5368, 3640, 940, 21755, 3082, 323, 479, 1142, 273, 253, 3916, 403, 417, 14282, 323, 1650, 275, 1524, 4893, 323, 2500, 493, 486, 465, 69, 3082, 973, 3215, 11273, 9732, 3210, 403, 3798, 417, 2130, 50276, 2520, 310, 2032, 533, 752, 310, 253, 32489, 273, 973, 3215, 11273, 9732, 3210, 403, 3798, 417, 2130, 512, 940, 21755, 3082, 2430, 247, 9732, 1566, 253, 4081, 1332, 1335, 878, 281, 6194, 247, 9732, 1566, 275, 958, 891, 1537, 1750, 326, 2500, 493, 486, 465, 69, 3082, 476, 16584, 5368, 3215, 11273, 9732, 3210, 5727, 253, 4081, 1332, 1900, 556, 281, 6194, 247, 9732, 1566, 1309, 940, 21755, 2074, 1750, 369, 1160, 327, 20462, 2570, 940, 21755, 11655, 347, 247, 32489, 533, 247, 2608, 1057, 417, 452, 281, 2216, 747, 4948, 273, 11655, 604, 344, 6689, 7356, 281, 5368, 3082, 327, 253, 643, 1133, 604, 253, 4081, 1332, 3395, 4633, 8607, 778, 823, 24026, 11655, 327, 1755, 273, 253, 4081, 1332, 285, 326, 651, 417, 1646, 751, 247, 32489, 323, 253, 4081, 1332, 50275, 8826, 1461, 16723, 337, 672, 10941, 281, 1027, 940, 21755, 3082, 275, 2829, 374, 403, 512, 9732, 3210, 452, 253, 1072, 10336, 374, 672, 11365, 253, 9732, 1566, 513, 253, 4477, 26641, 253, 13461, 273, 253, 9732, 1566, 12421, 390, 2556, 281, 253, 5974, 1566, 50276, 6438, 4361, 253, 2488, 
8680, 891, 15047, 619, 13716, 281, 721, 923, 6128, 275, 436, 6293, 323, 4606, 7152, 33032, 1189, 455, 891, 6273, 323, 42876, 2708, 253, 14924, 7887, 891, 1158, 253, 2934, 310, 8489, 4460, 323, 253, 6843, 4602, 940, 21755, 3340, 323, 2831, 18428, 2242, 797, 311, 4071, 27935, 18634, 436, 2929, 29328, 247, 747, 5700, 273, 3640, 940, 21755, 1925, 50276, 911, 20692, 4602, 940, 21755, 299, 2428, 285, 253, 299, 2428, 17170, 3640, 3700, 3066, 2831, 18428, 2242, 797, 311, 4071, 27935, 18634, 1293, 7296, 6041, 3640, 940, 21755, 11655, 4679, 327, 4633, 2460, 9162, 8892, 921, 253, 4081, 1332, 310, 3576, 2299, 2067, 7350, 1690, 253, 19843, 273, 253, 2929, 285, 690, 3081, 28913, 2175, 50276, 2887, 772, 2708, 906, 275, 253, 3061, 50275, 856, 84, 50276, 18, 253, 3640, 940, 21755, 407, 2831, 18428, 2242, 797, 311, 4071, 27935, 18634, 310, 8489, 4460, 281, 479, 374, 436, 2929, 310, 3477, 281, 956, 253, 16182, 629, 310, 2590, 281, 479, 495, 253, 4679, 629, 921, 2508, 28913, 1263, 273, 1016, 4445, 285, 253, 24864, 10894, 2761, 2508, 273, 4679, 534, 1361, 253, 3114, 281, 18302, 253, 4081, 3082, 50275, 5040, 50276, 18, 253, 806, 4468, 310, 670, 16038, 337, 253, 2488, 3916, 6041, 465, 69, 3082, 5644, 281, 2570, 13757, 16566, 1580, 597, 9569, 767, 4465, 4373, 22041, 281, 619, 1682, 273, 253, 3640, 841, 767, 3602, 452, 417, 1512, 1199, 3186, 2317, 2563, 3276, 310, 432, 4791, 285, 253, 2801, 310, 246, 19, 432, 288, 2878, 790, 2929, 285, 1563, 2929, 50276, 19, 253, 32489, 273, 327, 383, 486, 465, 69, 3082, 310, 247, 1652, 2372, 689, 7041, 50276, 15617, 581, 285, 277, 1686, 476, 320, 3732, 281, 247, 5235, 273, 10336, 275, 619, 4743, 253, 9732, 2216, 273, 50276, 886, 69, 50276, 25739, 84, 247, 2074, 5700, 342, 581, 285, 697, 11640, 534, 310, 253, 9732, 310, 14200, 685, 253, 5974, 50276, 1189, 455, 891, 1158, 253, 16038, 273, 436, 2929, 3198, 281, 320, 1077, 10182, 281, 2590, 374, 253, 28959, 273, 5301, 310, 253, 7870, 21842, 27311, 4445, 908, 281, 275, 5974, 2990, 1057, 436, 588, 4833, 253, 5301, 273, 50276, 2420, 19, 1057, 581, 285, 277, 1686, 671, 897, 326, 495, 2139, 253, 8356, 4561, 10954, 273, 299, 2428, 310, 1199, 2406, 685, 643, 3082, 275, 2829, 19, 275, 1307, 273, 3045, 285, 1543, 275, 2169, 5974, 3045, 310, 627, 667, 8813, 1060, 751, 824, 347, 1249, 577, 812, 368, 2085, 253, 13782, 2105, 5301, 273, 253, 4081, 1332, 285, 643, 327, 383, 486, 3082, 275, 2829, 19, 608, 690, 4102, 256, 5503, 789, 310, 9829, 5910, 3738, 891, 871, 253, 3045, 273, 436, 2929, 310, 41731, 10574, 891, 1158, 597, 878, 281, 320, 5469, 50275, 14005, 337, 5520, 3640, 940, 21755, 3066, 9732, 13372, 50276, 19, 3186, 281, 940, 408, 27887, 5200, 403, 11678, 533, 417, 253, 2927, 50276, 20, 3909, 3640, 940, 21755, 3066, 27549, 4715, 50276, 21, 14218, 27549, 4715, 323, 3909, 3640, 940, 21755, 2490, 187, 4118, 18435, 27, 36871, 940, 21755, 465, 69, 556, 644, 7561, 908, 275, 3946, 323, 19007, 50276, 249, 436, 2929, 247, 12955, 273, 465, 69, 310, 4081, 1677, 247, 5974, 2990, 271, 24026, 9732, 10336, 310, 20220, 4561, 3066, 7870, 21842, 2410, 17009, 14086, 4735, 10291, 403, 5611, 281, 13450, 1949, 253, 9732, 285, 5974, 3210, 253, 4081, 1332, 310, 4460, 285, 4722, 16774, 1543, 2692, 326, 253, 4081, 1332, 476, 1347, 1805, 685, 2067, 465, 69, 11640, 50276, 35529, 352, 310, 12744, 2139, 253, 4081, 1332, 2987, 3738, 253, 4477, 3597, 281, 2953, 436, 2523, 275, 616, 30080, 22559, 50275, 67, 11587, 436, 50276, 66, 8750, 4468, 327, 436, 789, 310, 326, 352, 9829, 247, 5301, 342, 247, 3332, 2746, 275, 337, 534, 4453, 1199, 19554, 285, 17923, 3012, 1805, 327, 
2074, 4679, 50276, 249, 337, 616, 501, 3024, 1235, 16987, 89, 310, 4577, 685, 253, 5974, 1566, 275, 436, 2929, 534, 908, 625, 15116, 327, 253, 1755, 533, 2692, 1199, 10046, 3045, 327, 1097, 4103, 285, 7880, 11701, 689, 253, 1072, 8245, 3733, 432, 20041, 323, 253, 4440, 257, 292, 9162, 4836, 327, 253, 7681, 1930, 253, 1332, 275, 337, 3365, 4648, 253, 3236, 501, 3024, 1235, 347, 253, 9732, 1566, 50276, 395, 253, 5974, 1566, 501, 3024, 1235, 16987, 89, 31414, 43341, 253, 10444, 18012, 273, 253, 9732, 1566, 432, 3828, 281, 3828, 337, 671, 4428, 247, 50276, 783, 30325, 1783, 50276, 10722, 3423, 1783, 1754, 281, 1329, 616, 1332, 10941, 342, 253, 1332, 275, 337, 253, 4081, 1332, 1060, 310, 625, 9542, 1679, 17194, 285, 1679, 5919, 50275, 18, 277, 1182, 14451, 278, 9094, 260, 260, 864, 246, 278, 1205, 278, 23136, 1269, 4498, 2805, 458, 2805, 632, 86, 285, 277, 5807, 86, 321, 24044, 564, 4618, 840, 6891, 5919, 3733, 273, 3676, 6906, 6928, 17857, 1686, 9169 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the authors come up with a loss for kd that encourages not only the correlation of relative differences in scores of the class predictions for an image but also the correlation of the relative differences of scores of the images for a class they claim that this sort of loss improves kd from strong teacher models and they show improvements on imagenet coco cityscapes etc strengths they show improvements on various benchmarks their proposed loss is straightforward to understand weaknesses concerns it is not clear to me how their described intuition in lines 142 to 148 relates to the columnwise loss where they try to correlate the relative difference in scores per class some relevant literature references are missing https://arxiv.org/abs/1910.01348 https://arxiv.org/abs/1902.03393 the title seems overly generic and doesnt describe the gist of the paper kd is obviously done from a stronger teacher thats not novel the main novelty is the row and column wise loss the teachers are also not that strong by current standards they use a resnet 18 and 50 nowadays we have vits for image recognition and other stronger models than resnet 18 some qualitative examples and quantitative analysis on what fails to be distilled from the strong teachers would also be nice to have docsepthis paper raises an overlooked question in knowledge distillation how to distill from a strong teacher ie how to conduct knowledge distillation when the discrepancy between the teacher and student models is large this paper conducted an empirical study and found that existing kd methods may fail when distilling from a strong teacher this paper further proposed to use the pearson correlation coefficient as a new matching manner to replace the kl divergence strengths the proposed research question how to distill from a strong teacher is an important but overlooked problem for knowledge distillation which has a great potential in realworld applications the research problem is novel and practical this paper conducted a comprehensive empirical study on how previous kd methods fail when the discrepancy between the teacher and student models is large this paper proposed to use the pearson correlation coefficient as a new matching manner to replace the kl divergence the proposed method is simple efficient and practical the paper conducted extensive experiments on various benchmark datasets including image classification object detection and semantic segmentation the paper is well written and organized weaknesses the compared methods in table 5 are not stateoftheart some recent works see references are not compared the font size in figures 1 and 2 is too small it would be great if these two figures can be reorganized in the future version references r1 knowledge distillation meets selfsupervision eccv 2020 r2 densely guided knowledge distillation using multiple teacher assistants iccv 2021 the authors have addressed the limitations and potential negative societal impact of their work docsepthis paper investigates the topic of learning from a stronger teacher in kd the authors show that using kl divergence in kd may not perform well when distilling from stronger teachers and propose a new kd method called dist to only preserve the relations between the teacher and student outputs instead of exactly matching them in kl divergence numerical results on different tasks are provided to show the superiority of dist on baseline and stronger teacher settings strengths 1
the problem of learning from a stronger teacher is interesting and worth investigating many previous works have shown that kd performs poorly when the capacity gap between teacher and student is large this paper extends this topic to a stronger teacher where stronger denotes larger capacity or a stronger training strategy and provides a unified solution 2 the proposed method is simple and effective the authors empirically find that the discrepancy between the student and a stronger teacher becomes larger and therefore adopt the correlation coefficient to alleviate this discrepancy the method is intuitively sound and well supported by the experiments 3 the improvements compared to kd are significant according to the numerous experiments on various tasks weaknesses 1 dist is less effective on larger student networks for example in table 4 dist achieves a 1.9 improvement on resnet18 compared to kd but only improves resnet34 and resnet50 by 0.6 and 0.2 2 the performance of predictionbased distillation methods is limited on dense prediction tasks eg detection compared to featurebased methods as the feature contains more localization information yes the authors adequately discussed the limitations of this paper docsepthis paper proposes a new knowledge distillation method that can use a stronger teacher to make a better student it proposes a correlationbased loss at the interclass level and intraclass level the distance between the teacher predictions and the student predictions is computed using the pearson correlation coefficient instead of kl divergence experiments show the effectiveness of the proposed method strengths 1 existing kd methods perform worse when the teacher becomes stronger while the proposed method can use a stronger teacher to train a better student 2 experimental results verify the effectiveness weaknesses 1 one of the key components is the matching metric namely the pearson correlation coefficient pcc however the assumption that pcc is a more relaxed constraint compared with kl divergence because of its invariance to scale and shift is not convincing enough the constraint strength of a loss function is defined via its gradient distribution for example kl divergence and mse loss have the same optimal solution while mse loss is stricter than kl because of stricter punishment according to its gradient distribution from this perspective it is necessary to provide the gradient comparison between kl and pcc 2 the experiments are not sufficient enough 2.1 there are limited types of teacher architectures 2.2 most compared methods are proposed before 2019 see tab 5 2.3 the compared methods are not sufficient in tab 3 and 4 2.4 the overall performance comparisons are only conducted on the smallscale dataset ie cifar100 large datasets eg imagenet should also be evaluated 2.5 the performance improvement compared with sotas is marginal see tab 5 some students only have a 0.06 gain compared with crd 3 there are some typos and some improper presentations the text in the figures is too small especially the text in fig 2 some typos such as on each classes in the caption of fig 3 should be corrected the authors have discussed the limitations and societal impacts of their works the proposed method cannot fully address binary classification tasks ### Summary:
the paper presents a new kd loss different from the widely used kl divergence for learning from strong teachers that have large capacity gaps from their students the authors provide a comprehensive study and improve results on challenging benchmarks the contribution is significant to the kd community and the ac recommends acceptance the authors may want to carefully update the paper based on the constructive comments for the cameraready version
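The reviews and summary above describe replacing the KL-divergence matching term with a Pearson-correlation-based relation loss, applied both across classes for each sample (inter-class) and across samples for each class (intra-class). Below is an assumed, simplified sketch of such a loss, not the paper's reference implementation; the temperature, loss weights, and normalization details are assumptions. Because the Pearson correlation is invariant to shift and scale of the predictions, it only asks the student's outputs to be linearly related to the teacher's rather than to match them exactly.

```python
import torch
import torch.nn.functional as F

def pearson_dissimilarity(a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """1 - Pearson correlation along the last dimension, averaged over the remaining one."""
    a = a - a.mean(dim=-1, keepdim=True)
    b = b - b.mean(dim=-1, keepdim=True)
    corr = (a * b).sum(dim=-1) / (a.norm(dim=-1) * b.norm(dim=-1) + eps)
    return (1.0 - corr).mean()

def relation_kd_loss(student_logits, teacher_logits, tau=4.0, w_inter=1.0, w_intra=1.0):
    p_s = F.softmax(student_logits / tau, dim=1)  # (batch, classes)
    p_t = F.softmax(teacher_logits / tau, dim=1)
    inter_class = pearson_dissimilarity(p_s, p_t)          # per sample, across classes
    intra_class = pearson_dissimilarity(p_s.t(), p_t.t())  # per class, across the batch
    return w_inter * inter_class + w_intra * intra_class

# usage: add this relation loss to the ordinary task loss on ground-truth labels
student_logits = torch.randn(16, 100, requires_grad=True)
teacher_logits = torch.randn(16, 100)
loss = relation_kd_loss(student_logits, teacher_logits)
loss.backward()
```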
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 1705, 598, 342, 247, 2957, 323, 465, 69, 326, 29426, 417, 760, 253, 5921, 273, 4103, 3910, 275, 7363, 273, 253, 966, 13650, 323, 271, 2460, 533, 671, 253, 5921, 273, 253, 4103, 3910, 273, 7363, 273, 253, 3888, 323, 247, 966, 597, 1750, 326, 436, 3686, 273, 2957, 19132, 465, 69, 432, 2266, 9732, 3210, 285, 597, 921, 11701, 327, 4440, 257, 292, 9285, 80, 2846, 1026, 9652, 3966, 50276, 296, 3755, 20556, 50276, 9328, 921, 11701, 327, 2710, 49602, 50275, 14094, 4081, 2957, 310, 4951, 3579, 281, 2096, 50275, 20881, 1255, 265, 585, 1209, 2224, 50276, 262, 310, 417, 2590, 281, 479, 849, 616, 2529, 30328, 275, 3104, 21669, 281, 20995, 14588, 281, 253, 5084, 3020, 2957, 835, 1611, 281, 24888, 253, 4103, 3064, 275, 7363, 591, 966, 50275, 8826, 4623, 6239, 10414, 5816, 5987, 39962, 2061, 5375, 746, 2313, 1012, 2385, 5987, 39962, 2061, 5375, 746, 9992, 1610, 4590, 50274, 783, 4060, 3133, 27662, 12314, 285, 36908, 6266, 253, 305, 382, 273, 253, 2929, 465, 69, 310, 9090, 2218, 432, 247, 10046, 9732, 28763, 417, 4460, 253, 2022, 38135, 310, 253, 4194, 285, 5084, 15822, 2957, 50275, 783, 10954, 403, 671, 417, 326, 2266, 407, 1655, 7465, 597, 897, 247, 501, 3024, 1283, 285, 2456, 31735, 359, 452, 362, 953, 323, 2460, 761, 543, 79, 539, 285, 643, 10046, 3210, 685, 501, 3024, 1283, 50276, 8826, 18276, 6667, 285, 11745, 1783, 327, 752, 10224, 281, 320, 35755, 432, 253, 2266, 10954, 651, 671, 320, 5322, 281, 452, 50276, 7152, 33032, 2520, 2929, 7164, 247, 28849, 1953, 275, 3640, 940, 21755, 849, 281, 940, 408, 432, 247, 2266, 9732, 26332, 849, 281, 2589, 3640, 940, 21755, 672, 253, 26210, 875, 253, 9732, 285, 5974, 3210, 310, 1781, 436, 2929, 5196, 271, 16774, 1263, 285, 1119, 326, 5368, 465, 69, 3082, 778, 1891, 672, 940, 3867, 432, 247, 2266, 9732, 436, 2929, 2007, 4081, 281, 897, 253, 27887, 1665, 5921, 10235, 347, 247, 747, 3761, 5133, 281, 8171, 253, 27451, 23279, 20544, 50275, 783, 4081, 2561, 1953, 849, 281, 940, 408, 432, 247, 2266, 9732, 310, 1774, 533, 28849, 1895, 323, 3640, 940, 21755, 534, 556, 247, 1270, 2442, 275, 1524, 10186, 4893, 253, 2561, 1895, 310, 4460, 285, 8542, 50275, 2520, 2929, 5196, 247, 11088, 16774, 1263, 327, 849, 2045, 465, 69, 3082, 1891, 672, 253, 26210, 875, 253, 9732, 285, 5974, 3210, 310, 1781, 50274, 2520, 2929, 4081, 281, 897, 253, 27887, 1665, 5921, 10235, 347, 247, 747, 3761, 5133, 281, 8171, 253, 27451, 23279, 253, 4081, 1332, 310, 2969, 5919, 285, 8542, 50275, 783, 2929, 5196, 9470, 4679, 327, 2710, 22791, 15302, 1690, 2460, 9162, 1789, 5481, 285, 24705, 26405, 50274, 783, 2929, 310, 973, 3542, 285, 10932, 50276, 20881, 1255, 265, 50275, 783, 2429, 3082, 275, 2829, 608, 403, 417, 1375, 23037, 14387, 690, 3332, 2987, 923, 10414, 403, 417, 2429, 50275, 783, 8266, 1979, 275, 8442, 337, 285, 374, 310, 1512, 1355, 352, 651, 320, 1270, 604, 841, 767, 8442, 476, 320, 294, 34092, 275, 253, 2852, 2715, 50276, 250, 3065, 50276, 83, 18, 3640, 940, 21755, 16382, 1881, 12185, 4694, 23746, 87, 9169, 391, 19, 42350, 18107, 3640, 940, 21755, 970, 2709, 9732, 35785, 17857, 17312, 43425, 253, 4477, 452, 9713, 253, 7364, 285, 2442, 4016, 38058, 3486, 273, 616, 789, 5474, 33032, 2520, 2929, 2340, 684, 253, 9400, 273, 4715, 432, 247, 10046, 9732, 275, 465, 69, 253, 4477, 921, 326, 970, 27451, 23279, 275, 465, 69, 778, 417, 1347, 973, 672, 940, 3867, 432, 10046, 10954, 285, 12661, 247, 747, 465, 69, 1332, 1925, 940, 281, 760, 14003, 253, 
2493, 875, 253, 9732, 285, 5974, 18012, 3185, 273, 4555, 11038, 275, 27451, 23279, 10704, 1543, 327, 1027, 8892, 403, 2530, 281, 921, 253, 34385, 273, 940, 327, 8245, 285, 10046, 9732, 7533, 20544, 337, 253, 1895, 273, 4715, 432, 247, 10046, 9732, 310, 4722, 285, 4409, 281, 7409, 1142, 2045, 2987, 452, 2011, 326, 253, 465, 69, 17923, 15225, 672, 253, 5350, 8037, 875, 9732, 285, 5974, 310, 1781, 436, 2929, 8725, 436, 9400, 281, 10046, 9732, 835, 253, 10046, 12853, 4067, 5350, 390, 10046, 3733, 5700, 285, 3400, 247, 27998, 2900, 50276, 19, 253, 4081, 1332, 310, 2969, 285, 3576, 253, 4477, 45190, 1089, 326, 253, 26210, 875, 253, 5974, 285, 247, 10046, 9732, 4916, 4067, 285, 3103, 5283, 5921, 10235, 281, 33623, 436, 26210, 253, 1332, 310, 540, 41597, 3590, 285, 973, 4516, 407, 253, 4679, 50276, 20, 253, 11701, 2429, 281, 465, 69, 310, 1534, 2556, 281, 253, 7418, 4679, 327, 2710, 8892, 50276, 20881, 1255, 265, 337, 940, 310, 1679, 3576, 327, 4067, 5974, 6928, 323, 1650, 275, 2829, 577, 940, 33526, 655, 7756, 327, 501, 3024, 1093, 2429, 281, 465, 69, 533, 760, 19132, 501, 3024, 1706, 285, 501, 3024, 1235, 407, 17796, 285, 16261, 50276, 19, 253, 3045, 273, 10554, 3169, 940, 21755, 1332, 310, 3710, 327, 14086, 10554, 8892, 24088, 5481, 2429, 281, 4735, 3169, 3082, 347, 253, 4735, 4428, 625, 14536, 1491, 50274, 9820, 253, 4477, 18212, 5469, 253, 7364, 273, 436, 2929, 50276, 7152, 33032, 2520, 2929, 29328, 247, 747, 3640, 940, 21755, 1332, 326, 476, 897, 247, 10046, 9732, 281, 1056, 247, 1805, 5974, 352, 29328, 247, 5921, 3169, 2957, 387, 253, 734, 2437, 1268, 285, 8376, 966, 1268, 253, 4181, 875, 253, 9732, 13650, 285, 253, 5974, 13650, 310, 10302, 970, 253, 27887, 1665, 5921, 10235, 3185, 273, 27451, 23279, 4679, 921, 253, 12510, 273, 253, 4081, 1332, 20544, 337, 186, 20137, 465, 69, 3082, 1347, 7197, 672, 253, 9732, 4916, 10046, 1223, 253, 4081, 1332, 476, 897, 247, 10046, 9732, 281, 6194, 247, 1805, 5974, 374, 186, 49363, 1543, 12654, 253, 12510, 50275, 20881, 1255, 265, 337, 186, 531, 273, 253, 2234, 4295, 310, 253, 11038, 7982, 10775, 253, 27887, 1665, 5921, 10235, 268, 550, 2299, 253, 9376, 326, 268, 550, 310, 247, 625, 19595, 7658, 2429, 342, 27451, 23279, 984, 273, 697, 31429, 281, 4311, 285, 5333, 310, 417, 21414, 2217, 253, 7658, 4757, 273, 247, 2957, 1159, 310, 2931, 3066, 697, 11786, 3268, 323, 1650, 27451, 23279, 285, 278, 339, 2957, 452, 253, 1072, 8654, 2900, 1223, 278, 339, 2957, 310, 34614, 350, 685, 27451, 984, 273, 34614, 350, 14232, 2556, 281, 697, 11786, 3268, 432, 436, 8668, 352, 310, 3309, 281, 2085, 253, 11786, 5301, 875, 27451, 285, 268, 550, 374, 186, 783, 4679, 403, 417, 4209, 2217, 3127, 627, 403, 3710, 3510, 273, 9732, 35615, 3307, 954, 2429, 3082, 403, 4081, 1078, 6247, 923, 10334, 608, 3495, 253, 2429, 3082, 403, 417, 4209, 275, 10334, 495, 285, 577, 2164, 253, 4583, 3045, 14023, 403, 760, 5196, 327, 253, 1355, 7527, 10895, 26332, 260, 338, 274, 2313, 1781, 15302, 24088, 4440, 257, 292, 943, 671, 320, 6760, 2030, 253, 3045, 7756, 2429, 342, 256, 302, 284, 310, 16888, 923, 10334, 608, 690, 3484, 760, 452, 247, 209, 7174, 6351, 2429, 342, 1531, 69, 495, 186, 9088, 403, 690, 963, 993, 285, 690, 14697, 27228, 253, 17438, 273, 253, 4677, 403, 1512, 1355, 3340, 253, 17438, 275, 3036, 19, 690, 963, 993, 824, 347, 327, 1016, 5971, 275, 253, 11743, 273, 3036, 495, 943, 320, 15045, 50276, 783, 4477, 452, 5469, 253, 7364, 285, 38058, 16274, 273, 616, 2987, 253, 4081, 1332, 2550, 4751, 2953, 253, 8985, 9162, 8892, 2490, 187, 4118, 18435, 27, 783, 2929, 10262, 247, 747, 465, 69, 
2957, 1027, 432, 253, 7561, 908, 27451, 23279, 323, 4715, 432, 2266, 10954, 665, 452, 1781, 18388, 875, 3484, 253, 4477, 2085, 247, 11088, 1263, 285, 3157, 253, 11132, 49602, 253, 7680, 310, 1534, 281, 253, 465, 69, 3114, 285, 913, 32636, 2997, 4477, 778, 971, 281, 9257, 15047, 253, 2929, 342, 25799, 5701, 323, 253, 4049, 254, 609, 5102, 2715, 50276 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 1705, 598, 342, 247, 2957, 323, 465, 69, 326, 29426, 417, 760, 253, 5921, 273, 4103, 3910, 275, 7363, 273, 253, 966, 13650, 323, 271, 2460, 533, 671, 253, 5921, 273, 253, 4103, 3910, 273, 7363, 273, 253, 3888, 323, 247, 966, 597, 1750, 326, 436, 3686, 273, 2957, 19132, 465, 69, 432, 2266, 9732, 3210, 285, 597, 921, 11701, 327, 4440, 257, 292, 9285, 80, 2846, 1026, 9652, 3966, 50276, 296, 3755, 20556, 50276, 9328, 921, 11701, 327, 2710, 49602, 50275, 14094, 4081, 2957, 310, 4951, 3579, 281, 2096, 50275, 20881, 1255, 265, 585, 1209, 2224, 50276, 262, 310, 417, 2590, 281, 479, 849, 616, 2529, 30328, 275, 3104, 21669, 281, 20995, 14588, 281, 253, 5084, 3020, 2957, 835, 1611, 281, 24888, 253, 4103, 3064, 275, 7363, 591, 966, 50275, 8826, 4623, 6239, 10414, 5816, 5987, 39962, 2061, 5375, 746, 2313, 1012, 2385, 5987, 39962, 2061, 5375, 746, 9992, 1610, 4590, 50274, 783, 4060, 3133, 27662, 12314, 285, 36908, 6266, 253, 305, 382, 273, 253, 2929, 465, 69, 310, 9090, 2218, 432, 247, 10046, 9732, 28763, 417, 4460, 253, 2022, 38135, 310, 253, 4194, 285, 5084, 15822, 2957, 50275, 783, 10954, 403, 671, 417, 326, 2266, 407, 1655, 7465, 597, 897, 247, 501, 3024, 1283, 285, 2456, 31735, 359, 452, 362, 953, 323, 2460, 761, 543, 79, 539, 285, 643, 10046, 3210, 685, 501, 3024, 1283, 50276, 8826, 18276, 6667, 285, 11745, 1783, 327, 752, 10224, 281, 320, 35755, 432, 253, 2266, 10954, 651, 671, 320, 5322, 281, 452, 50276, 7152, 33032, 2520, 2929, 7164, 247, 28849, 1953, 275, 3640, 940, 21755, 849, 281, 940, 408, 432, 247, 2266, 9732, 26332, 849, 281, 2589, 3640, 940, 21755, 672, 253, 26210, 875, 253, 9732, 285, 5974, 3210, 310, 1781, 436, 2929, 5196, 271, 16774, 1263, 285, 1119, 326, 5368, 465, 69, 3082, 778, 1891, 672, 940, 3867, 432, 247, 2266, 9732, 436, 2929, 2007, 4081, 281, 897, 253, 27887, 1665, 5921, 10235, 347, 247, 747, 3761, 5133, 281, 8171, 253, 27451, 23279, 20544, 50275, 783, 4081, 2561, 1953, 849, 281, 940, 408, 432, 247, 2266, 9732, 310, 1774, 533, 28849, 1895, 323, 3640, 940, 21755, 534, 556, 247, 1270, 2442, 275, 1524, 10186, 4893, 253, 2561, 1895, 310, 4460, 285, 8542, 50275, 2520, 2929, 5196, 247, 11088, 16774, 1263, 327, 849, 2045, 465, 69, 3082, 1891, 672, 253, 26210, 875, 253, 9732, 285, 5974, 3210, 310, 1781, 50274, 2520, 2929, 4081, 281, 897, 253, 27887, 1665, 5921, 10235, 347, 247, 747, 3761, 5133, 281, 8171, 253, 27451, 23279, 253, 4081, 1332, 310, 2969, 5919, 285, 8542, 50275, 783, 2929, 5196, 9470, 4679, 327, 2710, 22791, 15302, 1690, 2460, 9162, 1789, 5481, 285, 24705, 26405, 50274, 783, 2929, 310, 973, 3542, 285, 10932, 50276, 20881, 1255, 265, 50275, 783, 2429, 3082, 275, 2829, 608, 403, 417, 1375, 23037, 14387, 690, 3332, 2987, 923, 10414, 403, 417, 2429, 50275, 783, 8266, 1979, 275, 8442, 337, 285, 374, 310, 1512, 1355, 352, 651, 320, 1270, 604, 841, 767, 8442, 476, 320, 294, 34092, 275, 253, 2852, 2715, 50276, 250, 3065, 50276, 83, 18, 3640, 940, 21755, 16382, 1881, 12185, 4694, 23746, 87, 9169, 391, 19, 42350, 18107, 3640, 940, 21755, 970, 2709, 9732, 35785, 17857, 17312, 43425, 253, 4477, 452, 9713, 253, 7364, 285, 2442, 4016, 38058, 3486, 273, 616, 789, 5474, 33032, 2520, 2929, 2340, 684, 253, 9400, 273, 4715, 432, 247, 10046, 9732, 275, 465, 69, 253, 4477, 921, 326, 970, 27451, 23279, 275, 465, 69, 778, 417, 1347, 973, 672, 940, 3867, 432, 10046, 10954, 285, 12661, 247, 747, 465, 69, 1332, 1925, 940, 281, 760, 14003, 253, 
2493, 875, 253, 9732, 285, 5974, 18012, 3185, 273, 4555, 11038, 275, 27451, 23279, 10704, 1543, 327, 1027, 8892, 403, 2530, 281, 921, 253, 34385, 273, 940, 327, 8245, 285, 10046, 9732, 7533, 20544, 337, 253, 1895, 273, 4715, 432, 247, 10046, 9732, 310, 4722, 285, 4409, 281, 7409, 1142, 2045, 2987, 452, 2011, 326, 253, 465, 69, 17923, 15225, 672, 253, 5350, 8037, 875, 9732, 285, 5974, 310, 1781, 436, 2929, 8725, 436, 9400, 281, 10046, 9732, 835, 253, 10046, 12853, 4067, 5350, 390, 10046, 3733, 5700, 285, 3400, 247, 27998, 2900, 50276, 19, 253, 4081, 1332, 310, 2969, 285, 3576, 253, 4477, 45190, 1089, 326, 253, 26210, 875, 253, 5974, 285, 247, 10046, 9732, 4916, 4067, 285, 3103, 5283, 5921, 10235, 281, 33623, 436, 26210, 253, 1332, 310, 540, 41597, 3590, 285, 973, 4516, 407, 253, 4679, 50276, 20, 253, 11701, 2429, 281, 465, 69, 310, 1534, 2556, 281, 253, 7418, 4679, 327, 2710, 8892, 50276, 20881, 1255, 265, 337, 940, 310, 1679, 3576, 327, 4067, 5974, 6928, 323, 1650, 275, 2829, 577, 940, 33526, 655, 7756, 327, 501, 3024, 1093, 2429, 281, 465, 69, 533, 760, 19132, 501, 3024, 1706, 285, 501, 3024, 1235, 407, 17796, 285, 16261, 50276, 19, 253, 3045, 273, 10554, 3169, 940, 21755, 1332, 310, 3710, 327, 14086, 10554, 8892, 24088, 5481, 2429, 281, 4735, 3169, 3082, 347, 253, 4735, 4428, 625, 14536, 1491, 50274, 9820, 253, 4477, 18212, 5469, 253, 7364, 273, 436, 2929, 50276, 7152, 33032, 2520, 2929, 29328, 247, 747, 3640, 940, 21755, 1332, 326, 476, 897, 247, 10046, 9732, 281, 1056, 247, 1805, 5974, 352, 29328, 247, 5921, 3169, 2957, 387, 253, 734, 2437, 1268, 285, 8376, 966, 1268, 253, 4181, 875, 253, 9732, 13650, 285, 253, 5974, 13650, 310, 10302, 970, 253, 27887, 1665, 5921, 10235, 3185, 273, 27451, 23279, 4679, 921, 253, 12510, 273, 253, 4081, 1332, 20544, 337, 186, 20137, 465, 69, 3082, 1347, 7197, 672, 253, 9732, 4916, 10046, 1223, 253, 4081, 1332, 476, 897, 247, 10046, 9732, 281, 6194, 247, 1805, 5974, 374, 186, 49363, 1543, 12654, 253, 12510, 50275, 20881, 1255, 265, 337, 186, 531, 273, 253, 2234, 4295, 310, 253, 11038, 7982, 10775, 253, 27887, 1665, 5921, 10235, 268, 550, 2299, 253, 9376, 326, 268, 550, 310, 247, 625, 19595, 7658, 2429, 342, 27451, 23279, 984, 273, 697, 31429, 281, 4311, 285, 5333, 310, 417, 21414, 2217, 253, 7658, 4757, 273, 247, 2957, 1159, 310, 2931, 3066, 697, 11786, 3268, 323, 1650, 27451, 23279, 285, 278, 339, 2957, 452, 253, 1072, 8654, 2900, 1223, 278, 339, 2957, 310, 34614, 350, 685, 27451, 984, 273, 34614, 350, 14232, 2556, 281, 697, 11786, 3268, 432, 436, 8668, 352, 310, 3309, 281, 2085, 253, 11786, 5301, 875, 27451, 285, 268, 550, 374, 186, 783, 4679, 403, 417, 4209, 2217, 3127, 627, 403, 3710, 3510, 273, 9732, 35615, 3307, 954, 2429, 3082, 403, 4081, 1078, 6247, 923, 10334, 608, 3495, 253, 2429, 3082, 403, 417, 4209, 275, 10334, 495, 285, 577, 2164, 253, 4583, 3045, 14023, 403, 760, 5196, 327, 253, 1355, 7527, 10895, 26332, 260, 338, 274, 2313, 1781, 15302, 24088, 4440, 257, 292, 943, 671, 320, 6760, 2030, 253, 3045, 7756, 2429, 342, 256, 302, 284, 310, 16888, 923, 10334, 608, 690, 3484, 760, 452, 247, 209, 7174, 6351, 2429, 342, 1531, 69, 495, 186, 9088, 403, 690, 963, 993, 285, 690, 14697, 27228, 253, 17438, 273, 253, 4677, 403, 1512, 1355, 3340, 253, 17438, 275, 3036, 19, 690, 963, 993, 824, 347, 327, 1016, 5971, 275, 253, 11743, 273, 3036, 495, 943, 320, 15045, 50276, 783, 4477, 452, 5469, 253, 7364, 285, 38058, 16274, 273, 616, 2987, 253, 4081, 1332, 2550, 4751, 2953, 253, 8985, 9162, 8892, 2490, 187, 4118, 18435, 27, 783, 2929, 10262, 247, 747, 465, 69, 
2957, 1027, 432, 253, 7561, 908, 27451, 23279, 323, 4715, 432, 2266, 10954, 665, 452, 1781, 18388, 875, 3484, 253, 4477, 2085, 247, 11088, 1263, 285, 3157, 253, 11132, 49602, 253, 7680, 310, 1534, 281, 253, 465, 69, 3114, 285, 913, 32636, 2997, 4477, 778, 971, 281, 9257, 15047, 253, 2929, 342, 25799, 5701, 323, 253, 4049, 254, 609, 5102, 2715, 50276 ]
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper describes an alternative to l1l2 errors wrt output and one groundtruth example that are used to augment adversarial losses when training conditional gans while these augmented losses are often needed to stabilize and guide gan training the authors argue that they also bias the optimization of the generator towards mode collapse to address this the method proposes two kinds of alternate losses both of which essentially generate multiple sample outputs from the same input fit these with a gaussian distribution by computing the generating sample mean and variance and try to maximize the likelihood of the true training output under this distribution the paper provides theoretical and empirical analysis to show that the proposed approach leads to generators that produce samples that are both diverse and highquality i think this is a good paper and solves an important problem where one usually had to sacrifice diversity to obtain stable training by adding a reconstruction loss i recommend acceptance an interesting ablation experiment might be to see what happens when one no longer includes the gan loss and trains only with the mlmm or mcmle losses and compare this to training with only the l1l2 losses the other thing id like the authors to comment on are the potential shortcomings of using a simple uncorrelated gaussian to model the sample distributions it seems that such a distribution may not capture the fact that multiple dimensions of the output ie multiple pixel intensities are not independent conditioned on the input perhaps it may be worth exploring whether gaussians with general covariance matrices or independent in some decorrelated space learned from say simply the set of outputs may increase the efficacy of these losses postrebuttal ive read the other reviews and retain my positive impression of the paper i also appreciate that the authors have conducted additional experiments based on my nonbinding suggestions and the results are indeed interesting i am upgrading my score accordingly docsepthis paper analyzes the model collapse problems on training conditional gans and attributes it to the mismatch between gan loss and reconstruction loss this paper also proposes new types of reconstruction loss by measuring higher statistics for better multimodal conditional generation pros 1 the analysis in sec 44 is insightful which partially explains the success of mlmm and mcmle over previous methods in generating diverse conditional outputs 2 the paper is well written and easy to follow cons analysis on the experiments is a little insufficient as shown below i have some questions and suggestions about experiments 1 how is the training process affected by changing the reconstruction loss eg how the training curve changes do mlmm and mcmle converge slower or faster than the original ones what about training stability 2 why only mlmm1 is not compared with other methods on srganceleba and glcica from pix2pix cases it seems that gaussian mlmm1 performs much better than mlmm12 docsepthe paper proposes a modification to the traditional conditional gan objective which minimizes gan loss as well as either l1 or l2 pixelwise reconstruction losses in order to promote diverse multimodal generation of images the modification involves replacing the l1l2 reconstruction loss which predicts the first moment of a pixelwise gaussianlaplace respectively likelihood model assuming a constant
spherical covariance matrix with a new objective that matches the first and second moments of a pixelwise gaussianlaplace likelihood model with diagonal covariance matrix two models are proposed for matching the first and second moments the first one involves using a separate network to predict the moments from data which are then used to match the generators empirical estimates of the moments using k samples of generated images the second involves directly matching the empirical moment estimates using monte carlo the paper makes use of a wellestablished idea modeling pixelwise image likelihood with a diagonal covariance matrix ie heteroscedastic variance which as explained in 1 is a way to learn datadependent aleatoric uncertainty following 1 the usage of first and second moment prediction is also prevalent in recent deep generative models for example 2 ie image likelihood models predict the perpixel mean and variance in the l2 likelihood case for optimizing equation 4 from the paper recent work has also attempted to go beyond the assumption of a diagonal covariance matrix for example in 3 a banddiagonal covariance matrix is estimated hence the only novel idea in the paper seems to be the method for matching the empirical estimates of the first and second moments over k samples the motivation for doing this makes intuitive sense since diversity in generation is desired which is also demonstrated in the results section specific comments the loss of modality of reconstruction loss section 32 seems like something which doesnt require the extent of mathematical and empirical detail presented in the paper several of the cited works already mention the pitfalls of using reconstruction loss the analyses in section 44 are sound in derivation but not so much in the conclusions drawn it is not clear that the lack of existence of a generator that is an optimal solution to the gan and l2 loss individually implies that any learnt generator using gan l2 loss is suboptimal more explanation on this part would help the paper is well written presents a simple idea complete with experiments for comparing diversity with competing methods some theoretical analyses do no directly support the proposition eg sections 32 and 44 in my specific comments above hence the claim that the proposed method prevents mode collapse training stability and gives diverse multimodal predictions is supported by experiments and intuition for the method but not so much theoretically however the major weakness of the paper is the lack of novelty of the core idea update after rebuttal having read through the other reviews and the authors rebuttal i am unsatisfied with the rebuttal and i do not recommend accepting the paper my rating has decreased accordingly the reasons for my recommendation after discussion with other reviews are 1 lack of novelty and 2 weak theoretical results some justification of which was stated in my initial review above elaborating more on the second point i would like to mention some points which came up during the discussion with other reviewers the theoretical result which states that not using reconstruction loss given that multimodal outputs are desired is a weaker result than proving that the proposed method is actually effective in what it is designed to do there are empirical results to back that claim but i strongly believe that the theoretical results fall short and feel out of place in the overall justification for the proposed method this along with my earlier point of lack of novelty are the basis 
for my decision references 1 kendall alex and yarin gal what uncertainties do we need in bayesian deep learning for computer vision advances in neural information processing systems 2017 2 bloesch m czarnowski j clark r leutenegger s davison a j 2018 codeslam learning a compact optimisable representation for dense visual slam cvpr 2018 3 dorta g vicente s agapito l campbell n d simpson i 2018 february structured uncertainty prediction networks in proceedings of the ieee conference on computer vision and pattern recognition ### Summary:
the paper presents new loss functions which replace the reconstruction part for the training of conditional gans theoretical considerations and an empirical analysis show that the proposed loss can better handle multimodality of the target distribution than reconstruction based losses while being competitive in terms of image quality
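To make the mlmm / mcmle terms discussed in the reviews above concrete, here is a minimal sketch of the Monte-Carlo moment-matching variant. It is an illustration inferred from the review text only, not the paper's released code; the generator signature generator(x, z), the noise shape, and the sample count k are hypothetical choices introduced for this example.

```python
# Illustrative sketch only (a reading of the review, not the authors' code).
# K samples are generated for one input, a diagonal Gaussian is fitted per pixel
# from their empirical mean and variance, and the negative log-likelihood of the
# real target under that Gaussian replaces the usual l1/l2 reconstruction term.
import torch


def mc_moment_matching_loss(generator, x, y_real, k=8, eps=1e-6):
    # draw k conditional samples for the same input x (noise shape is an assumption)
    samples = torch.stack(
        [generator(x, torch.randn_like(x)) for _ in range(k)], dim=0
    )  # shape: (k, batch, channels, height, width)
    mu = samples.mean(dim=0)                  # empirical first moment
    var = samples.var(dim=0, unbiased=False)  # empirical second moment
    # per-pixel Gaussian negative log-likelihood of the ground-truth output
    nll = 0.5 * (torch.log(var + eps) + (y_real - mu) ** 2 / (var + eps))
    return nll.mean()
```

In a full training loop this term would be added to the adversarial loss in place of the l1/l2 term; the separate-network (mlmm) variant described in the review would instead predict mu and var with an auxiliary model rather than estimating them by sampling.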
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 8631, 271, 5795, 281, 298, 18, 77, 19, 6332, 8772, 3453, 285, 581, 3216, 33024, 1650, 326, 403, 908, 281, 35919, 48960, 11655, 672, 3733, 17697, 305, 507, 1223, 841, 31612, 11655, 403, 2223, 3058, 281, 33292, 285, 7102, 36827, 3733, 253, 4477, 9059, 326, 597, 671, 8492, 253, 13757, 273, 253, 14156, 4404, 4438, 13551, 281, 2953, 436, 253, 1332, 29328, 767, 9351, 273, 17958, 11655, 15617, 273, 534, 9093, 6635, 2709, 3410, 18012, 432, 253, 1072, 3280, 4944, 841, 342, 247, 305, 12064, 3268, 407, 12672, 253, 11365, 3410, 1599, 285, 11041, 285, 1611, 281, 22950, 253, 12177, 273, 253, 2032, 3733, 3453, 762, 436, 3268, 253, 2929, 3400, 10527, 285, 16774, 1783, 281, 921, 326, 253, 4081, 2746, 5644, 281, 21025, 326, 4711, 3530, 326, 403, 1097, 11117, 285, 1029, 15177, 50276, 74, 1158, 436, 310, 247, 1175, 2929, 285, 35910, 271, 1774, 1895, 2811, 581, 3798, 574, 281, 17789, 9991, 281, 4044, 6474, 3733, 407, 6240, 247, 14433, 2957, 891, 5583, 14924, 50276, 266, 4722, 28913, 3368, 1537, 320, 281, 923, 752, 6569, 672, 581, 642, 3356, 3797, 253, 36827, 2957, 285, 18784, 760, 342, 253, 13361, 2188, 390, 278, 3591, 282, 11655, 285, 7277, 436, 281, 3733, 342, 760, 253, 298, 18, 77, 19, 11655, 253, 643, 2181, 2654, 751, 253, 4477, 281, 4385, 327, 403, 253, 2442, 35387, 273, 970, 247, 2969, 41656, 4919, 305, 12064, 281, 1566, 253, 3410, 10670, 352, 3133, 326, 824, 247, 3268, 778, 417, 9232, 253, 958, 326, 2709, 10103, 273, 253, 3453, 26332, 2709, 12275, 26112, 403, 417, 3907, 27039, 327, 253, 3280, 4931, 352, 778, 320, 4409, 18216, 1880, 305, 10064, 2458, 342, 2087, 26677, 12624, 390, 3907, 275, 690, 11482, 4919, 2317, 6311, 432, 1333, 3365, 253, 873, 273, 18012, 778, 2572, 253, 10307, 273, 841, 11655, 50276, 5996, 250, 2858, 22559, 50276, 422, 1239, 253, 643, 10123, 285, 13280, 619, 2762, 13214, 273, 253, 2929, 891, 671, 11435, 326, 253, 4477, 452, 5196, 3081, 4679, 1754, 327, 619, 1327, 13018, 13991, 395, 253, 1543, 403, 6296, 4722, 891, 717, 38234, 619, 4868, 15672, 7152, 33032, 2520, 2929, 3537, 13505, 253, 1566, 13551, 3237, 327, 3733, 17697, 305, 507, 285, 11104, 352, 281, 253, 29713, 875, 36827, 2957, 285, 14433, 2957, 436, 2929, 671, 29328, 747, 3510, 273, 14433, 2957, 407, 10499, 2169, 9990, 323, 1805, 23390, 26306, 17697, 5978, 50276, 856, 84, 337, 186, 783, 1783, 275, 4706, 7127, 310, 47860, 534, 10571, 11424, 253, 2323, 273, 13361, 2188, 285, 278, 3591, 282, 689, 2045, 1332, 275, 11365, 11117, 17697, 18012, 374, 186, 783, 2929, 310, 973, 3542, 285, 3477, 281, 956, 50276, 5040, 1783, 327, 253, 4679, 310, 247, 1652, 12497, 347, 2011, 2708, 50276, 74, 452, 690, 3533, 285, 13991, 670, 4679, 50276, 18, 186, 5430, 1057, 253, 3733, 1232, 5876, 407, 6890, 253, 14433, 2957, 24088, 849, 253, 3733, 6970, 2544, 513, 13361, 2188, 285, 278, 3591, 282, 29623, 17357, 390, 7938, 685, 253, 3236, 4394, 752, 670, 3733, 7882, 50276, 19, 186, 22309, 760, 13361, 2188, 18, 310, 417, 2429, 342, 643, 3082, 327, 256, 15164, 593, 282, 5830, 285, 1289, 68, 3737, 432, 8066, 19, 30061, 2219, 352, 3133, 326, 305, 12064, 13361, 2188, 18, 17923, 1199, 1805, 685, 13361, 2188, 805, 5474, 339, 431, 248, 2929, 29328, 247, 11237, 281, 253, 5899, 17697, 36827, 8103, 534, 46926, 36827, 2957, 347, 973, 347, 2057, 298, 18, 390, 298, 19, 12275, 3020, 14433, 11655, 275, 1340, 281, 8591, 11117, 23390, 26306, 5978, 273, 3888, 253, 11237, 8687, 15706, 253, 298, 18, 77, 19, 14433, 2957, 50276, 
4609, 26295, 253, 806, 2774, 273, 247, 12275, 3020, 305, 12064, 4123, 5070, 2975, 12177, 1566, 7384, 247, 3638, 19474, 26677, 4315, 50276, 3113, 247, 747, 8103, 326, 10129, 253, 806, 285, 1273, 9506, 273, 247, 12275, 3020, 305, 12064, 4123, 5070, 12177, 1566, 342, 16421, 26677, 4315, 767, 3210, 403, 4081, 323, 11038, 253, 806, 285, 1273, 9506, 50276, 783, 806, 581, 8687, 970, 247, 4858, 2990, 281, 3283, 253, 9506, 432, 941, 534, 403, 840, 908, 281, 3761, 253, 21025, 16774, 8197, 273, 253, 9506, 970, 465, 3530, 273, 4561, 3888, 253, 1273, 8687, 3587, 11038, 253, 16774, 2774, 8197, 970, 1114, 442, 1113, 4213, 50276, 783, 2929, 2789, 897, 273, 247, 973, 21877, 2934, 50276, 7645, 272, 12275, 3020, 2460, 12177, 342, 247, 16421, 26677, 4315, 26332, 6895, 375, 758, 3258, 11041, 534, 347, 5544, 275, 337, 310, 247, 1039, 281, 3037, 2856, 324, 2662, 21844, 1080, 280, 11649, 1563, 337, 253, 10393, 273, 806, 285, 1273, 2774, 10554, 310, 671, 21270, 275, 3332, 3676, 1006, 800, 3210, 323, 1650, 374, 26332, 2460, 12177, 3210, 3283, 253, 591, 29206, 1599, 285, 11041, 275, 253, 298, 19, 12177, 1083, 323, 39793, 5150, 577, 432, 253, 2929, 3332, 789, 556, 671, 9919, 281, 564, 4457, 253, 9376, 273, 247, 16421, 26677, 4315, 323, 1650, 275, 495, 247, 3961, 41758, 26677, 4315, 310, 5998, 7613, 253, 760, 4460, 2934, 275, 253, 2929, 3133, 281, 320, 253, 1332, 323, 11038, 253, 16774, 8197, 273, 253, 806, 285, 1273, 9506, 689, 465, 3530, 253, 16038, 323, 2509, 436, 2789, 27350, 3282, 1580, 9991, 275, 5978, 310, 6799, 534, 310, 671, 5183, 275, 253, 1543, 50276, 4674, 2173, 5701, 50276, 783, 2957, 273, 36453, 273, 14433, 2957, 2593, 4567, 3133, 751, 1633, 534, 36908, 2430, 253, 6070, 273, 15965, 285, 16774, 2508, 3559, 275, 253, 2929, 2067, 273, 253, 11106, 2987, 2168, 3748, 253, 8483, 27366, 273, 970, 14433, 2957, 50275, 783, 6260, 275, 2593, 7127, 403, 3590, 275, 28529, 533, 417, 594, 1199, 275, 253, 11815, 8392, 352, 310, 417, 2590, 326, 253, 3480, 273, 6242, 273, 247, 14156, 326, 310, 271, 8654, 2900, 281, 253, 36827, 285, 298, 19, 2957, 15978, 8018, 326, 667, 34003, 14156, 970, 36827, 50276, 77, 19, 2957, 310, 749, 29776, 625, 8813, 327, 436, 629, 651, 1361, 50276, 783, 2929, 310, 973, 3542, 10262, 247, 2969, 2934, 3426, 342, 4679, 323, 10941, 9991, 342, 11771, 3082, 690, 10527, 6260, 513, 642, 3587, 1329, 253, 13989, 50276, 909, 7118, 4567, 285, 7127, 275, 619, 2173, 5701, 1840, 7613, 253, 1750, 326, 253, 4081, 1332, 16897, 4438, 13551, 3733, 7882, 285, 4245, 11117, 23390, 26306, 13650, 310, 4516, 407, 4679, 285, 30328, 323, 253, 1332, 533, 417, 594, 1199, 28055, 2299, 253, 2201, 14855, 273, 253, 2929, 310, 253, 3480, 273, 38135, 273, 253, 5161, 2934, 50275, 11183, 846, 30080, 22559, 1907, 1239, 949, 253, 643, 10123, 285, 253, 4477, 30080, 22559, 891, 717, 5061, 33496, 342, 253, 30080, 22559, 285, 891, 513, 417, 5583, 18738, 253, 2929, 619, 13716, 556, 6137, 15672, 50276, 783, 4606, 323, 619, 17401, 846, 5955, 342, 643, 10123, 403, 50276, 18, 3480, 273, 38135, 285, 374, 5075, 10527, 1543, 690, 22861, 273, 534, 369, 4767, 275, 619, 3302, 2278, 1840, 14883, 839, 625, 327, 253, 1273, 1127, 891, 651, 751, 281, 3748, 690, 2792, 534, 2210, 598, 1309, 253, 5955, 342, 643, 30628, 253, 10527, 906, 534, 3054, 326, 417, 970, 14433, 2957, 1677, 326, 23390, 26306, 18012, 403, 6799, 310, 247, 21076, 906, 685, 18597, 326, 253, 4081, 1332, 310, 2686, 3576, 275, 752, 352, 310, 4158, 281, 513, 627, 403, 16774, 1543, 281, 896, 326, 1750, 533, 891, 7052, 2868, 326, 253, 10527, 1543, 2965, 2159, 285, 1928, 562, 273, 1659, 275, 253, 
4583, 22861, 323, 253, 4081, 1332, 436, 2112, 342, 619, 4321, 1127, 273, 3480, 273, 38135, 403, 253, 3720, 323, 619, 3061, 50275, 250, 3065, 337, 465, 423, 455, 247, 1591, 285, 340, 19881, 5918, 752, 20418, 513, 359, 878, 275, 17699, 16561, 3676, 4715, 323, 4382, 8113, 16424, 275, 11454, 1491, 5162, 2718, 4240, 374, 787, 13347, 348, 278, 37441, 1596, 15767, 480, 502, 782, 391, 458, 10284, 909, 1063, 256, 50276, 34926, 1988, 247, 480, 4765, 11646, 5247, 28269, 247, 8566, 5556, 261, 494, 6779, 323, 14086, 5304, 48041, 30105, 1087, 4765, 495, 277, 29674, 305, 15951, 13589, 256, 639, 522, 7067, 298, 2986, 10910, 295, 277, 50276, 3549, 10836, 891, 4765, 704, 67, 4701, 18872, 11649, 10554, 6928, 275, 10061, 273, 253, 26332, 1796, 8059, 327, 4382, 8113, 285, 3102, 8981, 187, 187, 4118, 18435, 27, 783, 2929, 10262, 747, 2957, 3470, 534, 8171, 253, 14433, 629, 323, 253, 3733, 273, 17697, 305, 507, 10527, 15711, 285, 271, 16774, 1783, 921, 326, 253, 4081, 2957, 476, 1805, 6016, 23390, 351, 1319, 273, 253, 2303, 3268, 685, 14433, 1754, 11655, 1223, 1146, 12085, 275, 2426, 273, 2460, 3290 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 8631, 271, 5795, 281, 298, 18, 77, 19, 6332, 8772, 3453, 285, 581, 3216, 33024, 1650, 326, 403, 908, 281, 35919, 48960, 11655, 672, 3733, 17697, 305, 507, 1223, 841, 31612, 11655, 403, 2223, 3058, 281, 33292, 285, 7102, 36827, 3733, 253, 4477, 9059, 326, 597, 671, 8492, 253, 13757, 273, 253, 14156, 4404, 4438, 13551, 281, 2953, 436, 253, 1332, 29328, 767, 9351, 273, 17958, 11655, 15617, 273, 534, 9093, 6635, 2709, 3410, 18012, 432, 253, 1072, 3280, 4944, 841, 342, 247, 305, 12064, 3268, 407, 12672, 253, 11365, 3410, 1599, 285, 11041, 285, 1611, 281, 22950, 253, 12177, 273, 253, 2032, 3733, 3453, 762, 436, 3268, 253, 2929, 3400, 10527, 285, 16774, 1783, 281, 921, 326, 253, 4081, 2746, 5644, 281, 21025, 326, 4711, 3530, 326, 403, 1097, 11117, 285, 1029, 15177, 50276, 74, 1158, 436, 310, 247, 1175, 2929, 285, 35910, 271, 1774, 1895, 2811, 581, 3798, 574, 281, 17789, 9991, 281, 4044, 6474, 3733, 407, 6240, 247, 14433, 2957, 891, 5583, 14924, 50276, 266, 4722, 28913, 3368, 1537, 320, 281, 923, 752, 6569, 672, 581, 642, 3356, 3797, 253, 36827, 2957, 285, 18784, 760, 342, 253, 13361, 2188, 390, 278, 3591, 282, 11655, 285, 7277, 436, 281, 3733, 342, 760, 253, 298, 18, 77, 19, 11655, 253, 643, 2181, 2654, 751, 253, 4477, 281, 4385, 327, 403, 253, 2442, 35387, 273, 970, 247, 2969, 41656, 4919, 305, 12064, 281, 1566, 253, 3410, 10670, 352, 3133, 326, 824, 247, 3268, 778, 417, 9232, 253, 958, 326, 2709, 10103, 273, 253, 3453, 26332, 2709, 12275, 26112, 403, 417, 3907, 27039, 327, 253, 3280, 4931, 352, 778, 320, 4409, 18216, 1880, 305, 10064, 2458, 342, 2087, 26677, 12624, 390, 3907, 275, 690, 11482, 4919, 2317, 6311, 432, 1333, 3365, 253, 873, 273, 18012, 778, 2572, 253, 10307, 273, 841, 11655, 50276, 5996, 250, 2858, 22559, 50276, 422, 1239, 253, 643, 10123, 285, 13280, 619, 2762, 13214, 273, 253, 2929, 891, 671, 11435, 326, 253, 4477, 452, 5196, 3081, 4679, 1754, 327, 619, 1327, 13018, 13991, 395, 253, 1543, 403, 6296, 4722, 891, 717, 38234, 619, 4868, 15672, 7152, 33032, 2520, 2929, 3537, 13505, 253, 1566, 13551, 3237, 327, 3733, 17697, 305, 507, 285, 11104, 352, 281, 253, 29713, 875, 36827, 2957, 285, 14433, 2957, 436, 2929, 671, 29328, 747, 3510, 273, 14433, 2957, 407, 10499, 2169, 9990, 323, 1805, 23390, 26306, 17697, 5978, 50276, 856, 84, 337, 186, 783, 1783, 275, 4706, 7127, 310, 47860, 534, 10571, 11424, 253, 2323, 273, 13361, 2188, 285, 278, 3591, 282, 689, 2045, 1332, 275, 11365, 11117, 17697, 18012, 374, 186, 783, 2929, 310, 973, 3542, 285, 3477, 281, 956, 50276, 5040, 1783, 327, 253, 4679, 310, 247, 1652, 12497, 347, 2011, 2708, 50276, 74, 452, 690, 3533, 285, 13991, 670, 4679, 50276, 18, 186, 5430, 1057, 253, 3733, 1232, 5876, 407, 6890, 253, 14433, 2957, 24088, 849, 253, 3733, 6970, 2544, 513, 13361, 2188, 285, 278, 3591, 282, 29623, 17357, 390, 7938, 685, 253, 3236, 4394, 752, 670, 3733, 7882, 50276, 19, 186, 22309, 760, 13361, 2188, 18, 310, 417, 2429, 342, 643, 3082, 327, 256, 15164, 593, 282, 5830, 285, 1289, 68, 3737, 432, 8066, 19, 30061, 2219, 352, 3133, 326, 305, 12064, 13361, 2188, 18, 17923, 1199, 1805, 685, 13361, 2188, 805, 5474, 339, 431, 248, 2929, 29328, 247, 11237, 281, 253, 5899, 17697, 36827, 8103, 534, 46926, 36827, 2957, 347, 973, 347, 2057, 298, 18, 390, 298, 19, 12275, 3020, 14433, 11655, 275, 1340, 281, 8591, 11117, 23390, 26306, 5978, 273, 3888, 253, 11237, 8687, 15706, 253, 298, 18, 77, 19, 14433, 2957, 50276, 
4609, 26295, 253, 806, 2774, 273, 247, 12275, 3020, 305, 12064, 4123, 5070, 2975, 12177, 1566, 7384, 247, 3638, 19474, 26677, 4315, 50276, 3113, 247, 747, 8103, 326, 10129, 253, 806, 285, 1273, 9506, 273, 247, 12275, 3020, 305, 12064, 4123, 5070, 12177, 1566, 342, 16421, 26677, 4315, 767, 3210, 403, 4081, 323, 11038, 253, 806, 285, 1273, 9506, 50276, 783, 806, 581, 8687, 970, 247, 4858, 2990, 281, 3283, 253, 9506, 432, 941, 534, 403, 840, 908, 281, 3761, 253, 21025, 16774, 8197, 273, 253, 9506, 970, 465, 3530, 273, 4561, 3888, 253, 1273, 8687, 3587, 11038, 253, 16774, 2774, 8197, 970, 1114, 442, 1113, 4213, 50276, 783, 2929, 2789, 897, 273, 247, 973, 21877, 2934, 50276, 7645, 272, 12275, 3020, 2460, 12177, 342, 247, 16421, 26677, 4315, 26332, 6895, 375, 758, 3258, 11041, 534, 347, 5544, 275, 337, 310, 247, 1039, 281, 3037, 2856, 324, 2662, 21844, 1080, 280, 11649, 1563, 337, 253, 10393, 273, 806, 285, 1273, 2774, 10554, 310, 671, 21270, 275, 3332, 3676, 1006, 800, 3210, 323, 1650, 374, 26332, 2460, 12177, 3210, 3283, 253, 591, 29206, 1599, 285, 11041, 275, 253, 298, 19, 12177, 1083, 323, 39793, 5150, 577, 432, 253, 2929, 3332, 789, 556, 671, 9919, 281, 564, 4457, 253, 9376, 273, 247, 16421, 26677, 4315, 323, 1650, 275, 495, 247, 3961, 41758, 26677, 4315, 310, 5998, 7613, 253, 760, 4460, 2934, 275, 253, 2929, 3133, 281, 320, 253, 1332, 323, 11038, 253, 16774, 8197, 273, 253, 806, 285, 1273, 9506, 689, 465, 3530, 253, 16038, 323, 2509, 436, 2789, 27350, 3282, 1580, 9991, 275, 5978, 310, 6799, 534, 310, 671, 5183, 275, 253, 1543, 50276, 4674, 2173, 5701, 50276, 783, 2957, 273, 36453, 273, 14433, 2957, 2593, 4567, 3133, 751, 1633, 534, 36908, 2430, 253, 6070, 273, 15965, 285, 16774, 2508, 3559, 275, 253, 2929, 2067, 273, 253, 11106, 2987, 2168, 3748, 253, 8483, 27366, 273, 970, 14433, 2957, 50275, 783, 6260, 275, 2593, 7127, 403, 3590, 275, 28529, 533, 417, 594, 1199, 275, 253, 11815, 8392, 352, 310, 417, 2590, 326, 253, 3480, 273, 6242, 273, 247, 14156, 326, 310, 271, 8654, 2900, 281, 253, 36827, 285, 298, 19, 2957, 15978, 8018, 326, 667, 34003, 14156, 970, 36827, 50276, 77, 19, 2957, 310, 749, 29776, 625, 8813, 327, 436, 629, 651, 1361, 50276, 783, 2929, 310, 973, 3542, 10262, 247, 2969, 2934, 3426, 342, 4679, 323, 10941, 9991, 342, 11771, 3082, 690, 10527, 6260, 513, 642, 3587, 1329, 253, 13989, 50276, 909, 7118, 4567, 285, 7127, 275, 619, 2173, 5701, 1840, 7613, 253, 1750, 326, 253, 4081, 1332, 16897, 4438, 13551, 3733, 7882, 285, 4245, 11117, 23390, 26306, 13650, 310, 4516, 407, 4679, 285, 30328, 323, 253, 1332, 533, 417, 594, 1199, 28055, 2299, 253, 2201, 14855, 273, 253, 2929, 310, 253, 3480, 273, 38135, 273, 253, 5161, 2934, 50275, 11183, 846, 30080, 22559, 1907, 1239, 949, 253, 643, 10123, 285, 253, 4477, 30080, 22559, 891, 717, 5061, 33496, 342, 253, 30080, 22559, 285, 891, 513, 417, 5583, 18738, 253, 2929, 619, 13716, 556, 6137, 15672, 50276, 783, 4606, 323, 619, 17401, 846, 5955, 342, 643, 10123, 403, 50276, 18, 3480, 273, 38135, 285, 374, 5075, 10527, 1543, 690, 22861, 273, 534, 369, 4767, 275, 619, 3302, 2278, 1840, 14883, 839, 625, 327, 253, 1273, 1127, 891, 651, 751, 281, 3748, 690, 2792, 534, 2210, 598, 1309, 253, 5955, 342, 643, 30628, 253, 10527, 906, 534, 3054, 326, 417, 970, 14433, 2957, 1677, 326, 23390, 26306, 18012, 403, 6799, 310, 247, 21076, 906, 685, 18597, 326, 253, 4081, 1332, 310, 2686, 3576, 275, 752, 352, 310, 4158, 281, 513, 627, 403, 16774, 1543, 281, 896, 326, 1750, 533, 891, 7052, 2868, 326, 253, 10527, 1543, 2965, 2159, 285, 1928, 562, 273, 1659, 275, 253, 
4583, 22861, 323, 253, 4081, 1332, 436, 2112, 342, 619, 4321, 1127, 273, 3480, 273, 38135, 403, 253, 3720, 323, 619, 3061, 50275, 250, 3065, 337, 465, 423, 455, 247, 1591, 285, 340, 19881, 5918, 752, 20418, 513, 359, 878, 275, 17699, 16561, 3676, 4715, 323, 4382, 8113, 16424, 275, 11454, 1491, 5162, 2718, 4240, 374, 787, 13347, 348, 278, 37441, 1596, 15767, 480, 502, 782, 391, 458, 10284, 909, 1063, 256, 50276, 34926, 1988, 247, 480, 4765, 11646, 5247, 28269, 247, 8566, 5556, 261, 494, 6779, 323, 14086, 5304, 48041, 30105, 1087, 4765, 495, 277, 29674, 305, 15951, 13589, 256, 639, 522, 7067, 298, 2986, 10910, 295, 277, 50276, 3549, 10836, 891, 4765, 704, 67, 4701, 18872, 11649, 10554, 6928, 275, 10061, 273, 253, 26332, 1796, 8059, 327, 4382, 8113, 285, 3102, 8981, 187, 187, 4118, 18435, 27, 783, 2929, 10262, 747, 2957, 3470, 534, 8171, 253, 14433, 629, 323, 253, 3733, 273, 17697, 305, 507, 10527, 15711, 285, 271, 16774, 1783, 921, 326, 253, 4081, 2957, 476, 1805, 6016, 23390, 351, 1319, 273, 253, 2303, 3268, 685, 14433, 1754, 11655, 1223, 1146, 12085, 275, 2426, 273, 2460, 3290 ]
Below is given review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper tries to demystify the power of embedding and embedding space trying to connect the structure of learned embeddings and knowledge generation process the authors tested gaussian model preferential placement pp model and directional preferential placement dpp model for node generations the authors also tried to learn embeddings by graph representation learning algorithms after running ba models with several variances with gravitational movement and centroid nodeedge weighting main contributions 1 propose reasonable processes that can explain the evolution of knowledge embeddings in terms of preferential attachment and attractiverepulsive force 2 compare observed and generated embedding spaces via nonparametric statistics identifying frequency concentration and clustering velocity properties 3 evaluate and find the best generative process incremental insertion process with spatial contextdependent preferential attachment strengths this paper proposes interesting approaches to review popular static word embeddings the authors try to reconstruct a number of different embeddings based on different node generation models edge generating models and their combinations 1 it is exciting to read the authors endeavor on revisiting the old successful embeddings that brought a wide range of breakthrough in nlp 2 their approach of viewing embeddings purely as a result of nodeonly generations vs a product of graph that has both node and edge generations continued by graph embeddings seems to be a thorough instrument to investigate the embedding spaces 3 their findings of frequency concentration not only high frequency terms getting richer and clustering velocity but also they tend to be together are useful to support an important observation there is a significant amount of frequency information in learned embeddings 4 the authors revisit many interesting concepts and ideas in different fields like geography that covers spatial relationships and corresponding philosophy weaknesses 1 the primary concern is that reading this paper may not add useful intuitions in the era of contextual embeddings and largescale language models if the paper suggests any type of treatment that may alleviate too much dependency on the frequency information or that can improve the performance of stateoftheart language models by feeding better initial embeddings the research will become a gold but the current draft itself is less likely to impact on our community 2 all the models and experiments are designed and operated rather passively by tuning the hyperparameters and parameters then comparing the results with the popular learned embeddings it is of course useful to have such models analyses however it is more natural for the machine learning community to fit each model into the training portion of data and validate it on the holdout set before making comparisons otherwise we cannot say whether each suggested model actually has some degree of generalizability of the phenomena of interests 3 indeed we already understood in both highlevel and lowlevel that these static embeddings largely encode frequency information though they were often argued to encapsulate semantic information usual definition of the context in computational linguistics is a set of words in the sliding window of each word which inevitably connects to learning frequentialdistributional similarities rather than precisely semantic and syntactic
similarities it was also studied that frequent words are more frequently updated in sgd framework sharing the large norms if we use a gaussian like kernel it will not be surprising to see these are grouped questions 1 why do you believe that embeddings are optimized to preserve the natural clusters of the realworld knowledge in its metric space at least the three embeddings used in the paper never explicitly optimize any clustering objectives they want the words closely in the raw texts locate closely in the euclidean space 2 geometry often means the characteristics of the manifold or topological essence defined by a collection of open sets that will indeed define suitable metric for metric spaces some expressions about geometry or invariance would be too subjectively used 3 it is interesting to see how clustering velocity is defined as area under the curve in the graph of increasing number of connected components different types of graphs may have different scalefreeness and degree distributions and naturally differ in this clustering velocity but it is a bit doubtful that this metric can be drastically different even for two graphs with similar degree distributions by adversarially tweaking a couple of local connections as both models and new metrics are proposed there must be a natural question about the stability and sensitivity of this metric 4 more active form of contributions that can impact modern representation learning and transferbased language modeling will benefit the audiences in the community minor comments a2 left figure attach the figure reference as the actual figure is located in the previous page a3 figure 5 not in the main draft docsep in this paper the authors try to study if we can learn about the human process of generating new ideas or concepts from embeddings the authors first define a set of models for generating embeddings then the authors observe two properties 1 frequency concentration and 2 cluster velocity the authors finally compare the embeddings from generative models with the embeddings from realworld data driven methods strengths in general i find the question the authors try to address is interesting the authors methodology of approaching this problem seems fine weaknesses one concern i have is the embedding the authors use now the nlp community is adapting to embeddings from pretrained models eg bert it seems to me that analyzing these contextual embeddings would be better to understand the question the authors propose somehow i feel like the evaluation between the embeddings from the generative model and from the realworld is rather weak i guess another more straightforward way is to show the performance on downstream tasks using the generated embeddings in sum i think the authors look at an important problem but there are some limitations of this study eg not looking at contextual embeddings docsepthis paper tries to explore the natural processes that generate new knowledge or concepts specifically the authors propose two metrics to characterize embeddings trained on different datasets then they compare some synthetic data generated by models and realworld data according to the aforementioned metrics finally they conclude that the realworld data can be well simulated by a certain generative model strengths 1 this paper studies data generation processes with the help of embedding spaces it is an interesting problem and i think the idea of bridging embedding spaces and data generation processes is worth further exploration 2 the authors give a detailed
illustration of several data generation methods weakness 1 the title is a bit confusing the authors ask a question why do embedding spaces look as they do in the title however it seems that they did not answer it throughout this submission 2 the submission is not selfcontained for example at the bottom of page 6 the authors put an important figure in the appendix however as noted by the iclr committee reviewers are not required to read appendices 3 the linear correlations in the experiments seem to be weak especially in figure 2 the authors may want to calculate the pearsons correlation between freq concentration and ranking 4 as described in the conclusion section experiments in this work are not sufficient to identify a single dominant model for reasoning about the origins and structures of natural embedding spaces i agree that this work gives a good start for an interesting problem however since the authors did not give a clear conclusion the technical contribution of this submission seems to be weak this paper studies an important and interesting problem however it does not provide a good enough solution to the problem to meet the bar of iclr docsepthis paper proposes two measures of wordgraph embedding to characterize the evolution of wordgraph ie frequency concentration and clustering velocity many existing graph generation models are surveyed the proposed measures are calculated on many realworld datasets the strengths s1 this paper studies an interesting problem to analyze structure evolution from the geometry in embedding space s2 two novel measures are proposed s3 this paper is wellwritten the weaknesses w1 there is no problem formulation its hard to know what is the computational problem studied in this paper w2 its not clear how the proposed measures answer the raised problem w3 there is a lack of discussion of the rationale of the proposed measures w4 there is a lack of empirical validation for the proposed measure i have some concerns about this paper there is no problem formulation its hard to know what is the computational problem studied in this paper it seems that the goal of this paper is only to reveal and explain preferential attachment its not clear how the proposed measures answer the raised problem what is the intuition of the proposed measure in terms of analyzing structure evolution there is a lack of discussion of the rationale of the proposed measures there is a lack of empirical validation for the proposed measure this paper studies an interesting problem and proposes two novel measures this paper lacks some key elements such as problem formulation rationale discussion and empirical validation as it has too many weaknesses i have to recommend a reject ### Summary:
the authors of this work introduced new metrics for node embedding that can measure the evolution of the embeddings and compare them with existing graph embedding approaches and experimented on real datasets all reviewers agreed that the work addresses an interesting problem and that the proposed measures are novel but there are too many flaws in the initial version of the paper and despite the thorough responses of the authors it is believed that there are still too many open questions for this paper to be accepted at iclr this year
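The clustering velocity measure debated in the reviews above is described only as the area under the curve of the number of connected components as a neighborhood graph over the embeddings grows. The sketch below is one plausible implementation of that description, not the authors' code; the Euclidean metric, the linear threshold grid, and the trapezoidal integration are assumptions made for this example.

```python
# Illustrative sketch only (one plausible reading of the review's definition).
# "Clustering velocity" is approximated as the area under the curve of the
# number of connected components as an epsilon-neighborhood graph over the
# embeddings is grown.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import connected_components


def clustering_velocity(embeddings, n_thresholds=50):
    # embeddings: (n_words, dim) array; quadratic in n_words, so intended
    # for moderate vocabulary sizes
    d = cdist(embeddings, embeddings)
    thresholds = np.linspace(0.0, d.max(), n_thresholds)
    curve = []
    for eps in thresholds:
        adj = (d <= eps).astype(int)  # epsilon-neighborhood graph
        n_comp, _ = connected_components(adj, directed=False)
        curve.append(n_comp)
    curve = np.asarray(curve, dtype=float)
    # trapezoidal area under the (threshold, number-of-components) curve
    return np.sum((curve[1:] + curve[:-1]) / 2.0 * np.diff(thresholds))
```

Under this reading a smaller area means components merge at small radii, i.e. the embedding clusters quickly; comparing values across different embeddings would require a shared or normalized threshold grid, which speaks to the stability concern raised by the reviewer.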
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 14177, 281, 1471, 9207, 1419, 253, 1612, 273, 21496, 285, 21496, 2317, 2820, 281, 4684, 253, 2605, 273, 6311, 46234, 285, 3640, 5978, 1232, 253, 4477, 5762, 305, 12064, 1566, 41637, 14663, 7266, 1566, 285, 36799, 41637, 14663, 277, 377, 1566, 323, 4666, 14649, 253, 4477, 671, 3597, 281, 3037, 46234, 407, 4216, 6779, 4715, 11333, 846, 3515, 18927, 3210, 342, 2067, 48894, 342, 18924, 4866, 285, 1399, 2631, 4666, 13057, 42428, 50275, 7265, 9021, 337, 12661, 5272, 4870, 326, 476, 5513, 253, 5606, 273, 3640, 46234, 275, 2426, 273, 41637, 14170, 285, 12994, 4762, 22480, 3490, 50276, 19, 7277, 2540, 285, 4561, 21496, 8470, 3066, 1327, 36928, 9990, 12488, 4294, 4719, 285, 17524, 7602, 3607, 50276, 20, 7472, 285, 1089, 253, 1682, 1006, 800, 1232, 32809, 16941, 1232, 342, 8820, 3634, 6820, 41637, 14170, 50276, 296, 3755, 20556, 436, 2929, 29328, 4722, 7274, 281, 2278, 4633, 4228, 3159, 46234, 253, 4477, 1611, 281, 17029, 247, 1180, 273, 1027, 46234, 1754, 327, 1027, 4666, 5978, 3210, 5024, 11365, 3210, 285, 616, 13553, 50276, 18, 352, 310, 12302, 281, 1239, 253, 4477, 38937, 327, 27694, 2996, 253, 1711, 5547, 46234, 326, 3982, 247, 4618, 2491, 273, 29709, 275, 295, 24343, 50276, 19, 616, 2746, 273, 14657, 46234, 15846, 347, 247, 906, 273, 4666, 7483, 14649, 4632, 247, 1885, 273, 4216, 326, 556, 1097, 4666, 285, 5024, 14649, 4821, 407, 4216, 46234, 50276, 339, 3030, 281, 320, 11080, 7935, 281, 7409, 253, 21496, 8470, 50276, 20, 616, 4342, 273, 4294, 4719, 417, 760, 4122, 4294, 2426, 2970, 38539, 285, 17524, 7602, 533, 671, 597, 5257, 281, 320, 2366, 310, 4217, 281, 1329, 271, 1774, 8310, 627, 403, 1534, 2408, 273, 4294, 1491, 275, 6311, 46234, 50276, 21, 253, 4477, 45735, 1142, 4722, 12342, 285, 5697, 275, 1027, 4910, 751, 37756, 326, 10949, 8820, 7688, 285, 3969, 11727, 50274, 20881, 1255, 265, 337, 253, 3625, 4468, 310, 326, 4361, 436, 2929, 778, 417, 823, 4217, 16875, 4431, 275, 253, 8685, 273, 33876, 46234, 285, 1236, 2510, 25912, 3448, 3210, 604, 253, 2929, 5936, 667, 1511, 273, 1971, 326, 778, 33623, 1512, 1199, 18925, 327, 253, 4294, 1491, 390, 326, 476, 3157, 253, 3045, 273, 1375, 23037, 14387, 3448, 3210, 407, 12422, 1805, 3302, 46234, 253, 2561, 588, 2489, 247, 5328, 533, 253, 1655, 7482, 3139, 310, 1679, 2779, 281, 3486, 327, 776, 3114, 50276, 19, 512, 253, 3210, 285, 4679, 403, 4158, 285, 11658, 2581, 1509, 1242, 407, 25184, 253, 4373, 22041, 285, 3602, 840, 10941, 253, 1543, 342, 253, 4633, 6311, 46234, 352, 310, 273, 2282, 4217, 281, 452, 824, 3210, 6260, 2299, 352, 310, 625, 3626, 323, 5145, 4715, 3114, 281, 4944, 1016, 1566, 715, 253, 3733, 5110, 273, 941, 285, 17813, 352, 327, 253, 2186, 483, 873, 1078, 2403, 14023, 5010, 359, 2550, 1333, 1880, 1016, 5125, 1566, 2686, 556, 690, 4248, 273, 2087, 50228, 273, 253, 16958, 273, 6284, 50276, 20, 6296, 359, 2168, 7192, 275, 1097, 1029, 5251, 285, 1698, 5251, 326, 841, 4228, 46234, 8127, 22573, 4294, 1491, 2167, 597, 497, 2223, 9125, 281, 22642, 4187, 24705, 1491, 7312, 5426, 273, 253, 3634, 275, 15180, 20365, 3397, 310, 247, 873, 273, 3000, 275, 253, 20661, 3497, 273, 1016, 3159, 534, 24473, 23417, 281, 4715, 2549, 1624, 35360, 267, 22620, 2581, 685, 10534, 24705, 285, 43548, 9994, 22620, 352, 369, 671, 5421, 326, 10879, 3000, 403, 625, 7208, 9300, 275, 256, 35333, 7792, 9628, 253, 1781, 22429, 604, 359, 897, 305, 12064, 751, 10295, 352, 588, 320, 417, 10084, 281, 923, 841, 403, 24104, 
50274, 34974, 337, 2139, 513, 368, 2868, 326, 46234, 403, 18325, 281, 14003, 253, 3626, 9959, 273, 253, 1524, 10186, 3640, 275, 697, 7982, 2317, 387, 1878, 253, 1264, 46234, 908, 275, 253, 2929, 1620, 11120, 22318, 667, 17524, 16566, 597, 971, 253, 3000, 8244, 275, 253, 9305, 17438, 19912, 8244, 275, 253, 299, 26365, 2317, 50275, 19, 12087, 2223, 2097, 253, 5319, 273, 253, 16751, 390, 17597, 17718, 2931, 407, 247, 4849, 273, 1527, 5239, 326, 588, 6296, 4853, 7470, 7982, 323, 7982, 8470, 690, 12091, 670, 12087, 390, 31429, 651, 320, 1512, 2256, 1242, 908, 50276, 20, 352, 310, 4722, 281, 923, 849, 17524, 7602, 310, 2931, 347, 2170, 762, 253, 6970, 275, 253, 4216, 273, 3629, 1180, 273, 4802, 4295, 1027, 3510, 273, 14580, 5046, 1027, 9171, 832, 1820, 405, 285, 4248, 10670, 10748, 19986, 275, 436, 17524, 7602, 533, 352, 310, 247, 2372, 38342, 326, 436, 7982, 476, 320, 31063, 1027, 1014, 323, 767, 14580, 342, 2074, 4248, 10670, 407, 18539, 274, 1365, 13660, 1170, 4564, 273, 1980, 10291, 347, 1097, 3210, 285, 747, 17082, 403, 4081, 627, 1364, 320, 247, 3626, 1953, 670, 253, 7882, 285, 7340, 273, 436, 7982, 50276, 21, 625, 3939, 830, 273, 9021, 326, 476, 3486, 4980, 6779, 4715, 285, 3700, 3169, 3448, 14053, 588, 5649, 253, 23886, 275, 253, 3114, 50275, 37585, 5701, 247, 19, 1669, 4677, 50276, 38729, 253, 4677, 3806, 347, 253, 4588, 4677, 310, 4441, 275, 253, 2045, 3239, 50276, 66, 20, 4677, 608, 417, 275, 253, 2022, 7482, 50276, 7152, 33032, 275, 436, 2929, 50276, 783, 4477, 1611, 281, 1263, 604, 359, 476, 3037, 670, 253, 7497, 1232, 273, 11365, 747, 5697, 390, 12342, 432, 46234, 253, 4477, 806, 4853, 247, 873, 273, 3210, 323, 11365, 46234, 840, 253, 4477, 10018, 767, 3607, 337, 4294, 4719, 285, 374, 7368, 7602, 253, 4477, 4720, 7277, 253, 46234, 432, 1006, 800, 3210, 342, 253, 46234, 432, 1524, 10186, 941, 8877, 3082, 50276, 296, 3755, 20556, 50276, 249, 2087, 891, 1089, 253, 1953, 253, 4477, 1611, 281, 2953, 310, 4722, 253, 4477, 16182, 273, 17682, 436, 1895, 3133, 4030, 50276, 20881, 1255, 265, 50275, 531, 4468, 891, 452, 310, 253, 21496, 253, 4477, 897, 1024, 295, 24343, 3114, 310, 42174, 281, 46234, 432, 3215, 11273, 3210, 24088, 270, 797, 352, 3133, 281, 479, 326, 18918, 841, 33876, 46234, 651, 320, 1805, 281, 2096, 253, 1953, 253, 4477, 12661, 50275, 8826, 5430, 891, 1928, 751, 253, 7103, 875, 253, 46234, 432, 253, 1006, 800, 1566, 285, 432, 253, 1524, 10186, 310, 2581, 5075, 891, 5476, 1529, 625, 15246, 1039, 310, 281, 921, 253, 3045, 327, 15450, 8892, 970, 253, 4561, 46234, 50276, 249, 2020, 891, 1158, 253, 4477, 1007, 387, 271, 1774, 1895, 533, 627, 403, 690, 7364, 273, 436, 1263, 24088, 417, 2819, 387, 33876, 46234, 5474, 33032, 2520, 2929, 14177, 281, 8338, 253, 3626, 4870, 326, 6635, 747, 3640, 390, 12342, 5742, 253, 4477, 12661, 767, 17082, 281, 17710, 46234, 10166, 327, 1027, 15302, 840, 597, 7277, 690, 13506, 941, 4561, 407, 3210, 285, 1524, 10186, 941, 2556, 281, 253, 18979, 17082, 4720, 597, 7525, 326, 253, 1524, 10186, 941, 476, 320, 973, 15524, 407, 247, 2176, 1006, 800, 1566, 20544, 50276, 18, 436, 2929, 2175, 941, 5978, 4870, 342, 253, 1361, 273, 21496, 8470, 352, 310, 271, 4722, 1895, 285, 891, 1158, 253, 2934, 273, 49519, 21496, 8470, 285, 941, 5978, 4870, 310, 4409, 2007, 17947, 374, 253, 4477, 1918, 247, 7000, 23356, 273, 2067, 941, 5978, 3082, 50274, 20881, 1255, 337, 253, 4060, 310, 247, 2372, 21643, 253, 4477, 1642, 247, 1953, 2139, 513, 21496, 8470, 1007, 347, 597, 513, 275, 253, 4060, 2299, 352, 3133, 326, 597, 858, 417, 3662, 352, 4768, 436, 19529, 374, 253, 19529, 
310, 417, 1881, 41010, 323, 1650, 387, 253, 5004, 273, 3239, 721, 253, 4477, 1691, 271, 1774, 4677, 275, 253, 30762, 2299, 347, 4879, 407, 253, 17857, 32888, 9353, 30628, 403, 417, 2424, 281, 1239, 14801, 1271, 495, 253, 4872, 13007, 275, 253, 4679, 1646, 281, 320, 5075, 3340, 275, 4677, 374, 253, 4477, 778, 971, 281, 10173, 253, 268, 6364, 790, 5921, 875, 4107, 82, 4719, 285, 19947, 577, 347, 2529, 275, 253, 6452, 2593, 4679, 275, 436, 789, 403, 417, 4209, 281, 4271, 247, 2014, 11360, 1566, 323, 14720, 670, 253, 20801, 285, 5289, 273, 3626, 21496, 8470, 891, 5194, 326, 436, 789, 4245, 247, 1175, 1265, 323, 271, 4722, 1895, 2299, 1580, 253, 4477, 858, 417, 1918, 247, 2590, 6452, 253, 7681, 7680, 273, 436, 19529, 3133, 281, 320, 5075, 50276, 2520, 2929, 2175, 271, 1774, 285, 4722, 1895, 2299, 352, 1057, 417, 2085, 247, 1175, 2217, 2900, 281, 253, 1895, 281, 2525, 253, 2534, 273, 17857, 32888, 5474, 33032, 2520, 2929, 29328, 767, 5593, 273, 3159, 10580, 21496, 281, 1894, 253, 5606, 273, 3159, 10580, 26332, 4294, 4719, 285, 17524, 7602, 1142, 5368, 4216, 5978, 3210, 403, 28671, 253, 4081, 5593, 403, 5118, 327, 1142, 1524, 10186, 15302, 253, 20544, 256, 18, 436, 2929, 2175, 271, 4722, 1895, 281, 12106, 2605, 5606, 432, 253, 12087, 275, 21496, 2317, 256, 19, 767, 4460, 5593, 403, 4081, 256, 20, 436, 2929, 310, 973, 15720, 50276, 783, 32213, 259, 18, 627, 310, 642, 1895, 15895, 697, 1892, 281, 871, 752, 310, 253, 15180, 1895, 5421, 275, 436, 2929, 259, 19, 697, 417, 2590, 849, 513, 253, 4081, 5593, 3662, 253, 5439, 1895, 259, 20, 627, 310, 247, 3480, 273, 5955, 273, 253, 24775, 273, 253, 4081, 5593, 259, 21, 627, 310, 247, 3480, 273, 16774, 12820, 323, 253, 4081, 2557, 50276, 74, 452, 690, 7350, 670, 436, 2929, 627, 310, 642, 1895, 15895, 697, 1892, 281, 871, 752, 310, 253, 15180, 1895, 5421, 275, 436, 2929, 352, 3133, 326, 253, 4736, 273, 436, 2929, 310, 760, 281, 10313, 285, 5513, 41637, 14170, 697, 417, 2590, 849, 513, 253, 4081, 5593, 3662, 253, 5439, 1895, 752, 310, 253, 30328, 273, 253, 4081, 2557, 275, 2426, 273, 18918, 2605, 5606, 50276, 9088, 310, 247, 3480, 273, 5955, 273, 253, 24775, 273, 253, 4081, 5593, 627, 310, 247, 3480, 273, 16774, 12820, 323, 253, 4081, 2557, 436, 2929, 2175, 271, 4722, 1895, 285, 29328, 767, 4460, 5593, 436, 2929, 19756, 690, 2234, 3603, 824, 347, 1895, 15895, 24775, 5955, 285, 16774, 12820, 347, 697, 1512, 1142, 32213, 891, 452, 281, 5583, 247, 12009, 2490, 187, 4118, 18435, 27, 783, 4477, 273, 436, 789, 5611, 747, 17082, 323, 4666, 21496, 326, 476, 2557, 253, 5606, 273, 253, 46234, 285, 7277, 731, 342, 5368, 4216, 21496, 7274, 285, 3368, 264, 327, 1524, 15302, 50276, 455, 30628, 5821, 326, 253, 789, 12453, 4722, 1895, 285, 326, 253, 4081, 5593, 403, 642, 306, 533, 627, 403, 1512, 1142, 32138, 275, 253, 3302, 2715, 273, 253, 2929, 285, 5747, 253, 11080, 6128, 273, 253, 4477, 352, 310, 6566, 326, 627, 403, 1335, 1512, 1142, 1527, 3533, 323, 436, 2929, 281, 320, 7607, 436, 807, 17857, 32888 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 14177, 281, 1471, 9207, 1419, 253, 1612, 273, 21496, 285, 21496, 2317, 2820, 281, 4684, 253, 2605, 273, 6311, 46234, 285, 3640, 5978, 1232, 253, 4477, 5762, 305, 12064, 1566, 41637, 14663, 7266, 1566, 285, 36799, 41637, 14663, 277, 377, 1566, 323, 4666, 14649, 253, 4477, 671, 3597, 281, 3037, 46234, 407, 4216, 6779, 4715, 11333, 846, 3515, 18927, 3210, 342, 2067, 48894, 342, 18924, 4866, 285, 1399, 2631, 4666, 13057, 42428, 50275, 7265, 9021, 337, 12661, 5272, 4870, 326, 476, 5513, 253, 5606, 273, 3640, 46234, 275, 2426, 273, 41637, 14170, 285, 12994, 4762, 22480, 3490, 50276, 19, 7277, 2540, 285, 4561, 21496, 8470, 3066, 1327, 36928, 9990, 12488, 4294, 4719, 285, 17524, 7602, 3607, 50276, 20, 7472, 285, 1089, 253, 1682, 1006, 800, 1232, 32809, 16941, 1232, 342, 8820, 3634, 6820, 41637, 14170, 50276, 296, 3755, 20556, 436, 2929, 29328, 4722, 7274, 281, 2278, 4633, 4228, 3159, 46234, 253, 4477, 1611, 281, 17029, 247, 1180, 273, 1027, 46234, 1754, 327, 1027, 4666, 5978, 3210, 5024, 11365, 3210, 285, 616, 13553, 50276, 18, 352, 310, 12302, 281, 1239, 253, 4477, 38937, 327, 27694, 2996, 253, 1711, 5547, 46234, 326, 3982, 247, 4618, 2491, 273, 29709, 275, 295, 24343, 50276, 19, 616, 2746, 273, 14657, 46234, 15846, 347, 247, 906, 273, 4666, 7483, 14649, 4632, 247, 1885, 273, 4216, 326, 556, 1097, 4666, 285, 5024, 14649, 4821, 407, 4216, 46234, 50276, 339, 3030, 281, 320, 11080, 7935, 281, 7409, 253, 21496, 8470, 50276, 20, 616, 4342, 273, 4294, 4719, 417, 760, 4122, 4294, 2426, 2970, 38539, 285, 17524, 7602, 533, 671, 597, 5257, 281, 320, 2366, 310, 4217, 281, 1329, 271, 1774, 8310, 627, 403, 1534, 2408, 273, 4294, 1491, 275, 6311, 46234, 50276, 21, 253, 4477, 45735, 1142, 4722, 12342, 285, 5697, 275, 1027, 4910, 751, 37756, 326, 10949, 8820, 7688, 285, 3969, 11727, 50274, 20881, 1255, 265, 337, 253, 3625, 4468, 310, 326, 4361, 436, 2929, 778, 417, 823, 4217, 16875, 4431, 275, 253, 8685, 273, 33876, 46234, 285, 1236, 2510, 25912, 3448, 3210, 604, 253, 2929, 5936, 667, 1511, 273, 1971, 326, 778, 33623, 1512, 1199, 18925, 327, 253, 4294, 1491, 390, 326, 476, 3157, 253, 3045, 273, 1375, 23037, 14387, 3448, 3210, 407, 12422, 1805, 3302, 46234, 253, 2561, 588, 2489, 247, 5328, 533, 253, 1655, 7482, 3139, 310, 1679, 2779, 281, 3486, 327, 776, 3114, 50276, 19, 512, 253, 3210, 285, 4679, 403, 4158, 285, 11658, 2581, 1509, 1242, 407, 25184, 253, 4373, 22041, 285, 3602, 840, 10941, 253, 1543, 342, 253, 4633, 6311, 46234, 352, 310, 273, 2282, 4217, 281, 452, 824, 3210, 6260, 2299, 352, 310, 625, 3626, 323, 5145, 4715, 3114, 281, 4944, 1016, 1566, 715, 253, 3733, 5110, 273, 941, 285, 17813, 352, 327, 253, 2186, 483, 873, 1078, 2403, 14023, 5010, 359, 2550, 1333, 1880, 1016, 5125, 1566, 2686, 556, 690, 4248, 273, 2087, 50228, 273, 253, 16958, 273, 6284, 50276, 20, 6296, 359, 2168, 7192, 275, 1097, 1029, 5251, 285, 1698, 5251, 326, 841, 4228, 46234, 8127, 22573, 4294, 1491, 2167, 597, 497, 2223, 9125, 281, 22642, 4187, 24705, 1491, 7312, 5426, 273, 253, 3634, 275, 15180, 20365, 3397, 310, 247, 873, 273, 3000, 275, 253, 20661, 3497, 273, 1016, 3159, 534, 24473, 23417, 281, 4715, 2549, 1624, 35360, 267, 22620, 2581, 685, 10534, 24705, 285, 43548, 9994, 22620, 352, 369, 671, 5421, 326, 10879, 3000, 403, 625, 7208, 9300, 275, 256, 35333, 7792, 9628, 253, 1781, 22429, 604, 359, 897, 305, 12064, 751, 10295, 352, 588, 320, 417, 10084, 281, 923, 841, 403, 24104, 
50274, 34974, 337, 2139, 513, 368, 2868, 326, 46234, 403, 18325, 281, 14003, 253, 3626, 9959, 273, 253, 1524, 10186, 3640, 275, 697, 7982, 2317, 387, 1878, 253, 1264, 46234, 908, 275, 253, 2929, 1620, 11120, 22318, 667, 17524, 16566, 597, 971, 253, 3000, 8244, 275, 253, 9305, 17438, 19912, 8244, 275, 253, 299, 26365, 2317, 50275, 19, 12087, 2223, 2097, 253, 5319, 273, 253, 16751, 390, 17597, 17718, 2931, 407, 247, 4849, 273, 1527, 5239, 326, 588, 6296, 4853, 7470, 7982, 323, 7982, 8470, 690, 12091, 670, 12087, 390, 31429, 651, 320, 1512, 2256, 1242, 908, 50276, 20, 352, 310, 4722, 281, 923, 849, 17524, 7602, 310, 2931, 347, 2170, 762, 253, 6970, 275, 253, 4216, 273, 3629, 1180, 273, 4802, 4295, 1027, 3510, 273, 14580, 5046, 1027, 9171, 832, 1820, 405, 285, 4248, 10670, 10748, 19986, 275, 436, 17524, 7602, 533, 352, 310, 247, 2372, 38342, 326, 436, 7982, 476, 320, 31063, 1027, 1014, 323, 767, 14580, 342, 2074, 4248, 10670, 407, 18539, 274, 1365, 13660, 1170, 4564, 273, 1980, 10291, 347, 1097, 3210, 285, 747, 17082, 403, 4081, 627, 1364, 320, 247, 3626, 1953, 670, 253, 7882, 285, 7340, 273, 436, 7982, 50276, 21, 625, 3939, 830, 273, 9021, 326, 476, 3486, 4980, 6779, 4715, 285, 3700, 3169, 3448, 14053, 588, 5649, 253, 23886, 275, 253, 3114, 50275, 37585, 5701, 247, 19, 1669, 4677, 50276, 38729, 253, 4677, 3806, 347, 253, 4588, 4677, 310, 4441, 275, 253, 2045, 3239, 50276, 66, 20, 4677, 608, 417, 275, 253, 2022, 7482, 50276, 7152, 33032, 275, 436, 2929, 50276, 783, 4477, 1611, 281, 1263, 604, 359, 476, 3037, 670, 253, 7497, 1232, 273, 11365, 747, 5697, 390, 12342, 432, 46234, 253, 4477, 806, 4853, 247, 873, 273, 3210, 323, 11365, 46234, 840, 253, 4477, 10018, 767, 3607, 337, 4294, 4719, 285, 374, 7368, 7602, 253, 4477, 4720, 7277, 253, 46234, 432, 1006, 800, 3210, 342, 253, 46234, 432, 1524, 10186, 941, 8877, 3082, 50276, 296, 3755, 20556, 50276, 249, 2087, 891, 1089, 253, 1953, 253, 4477, 1611, 281, 2953, 310, 4722, 253, 4477, 16182, 273, 17682, 436, 1895, 3133, 4030, 50276, 20881, 1255, 265, 50275, 531, 4468, 891, 452, 310, 253, 21496, 253, 4477, 897, 1024, 295, 24343, 3114, 310, 42174, 281, 46234, 432, 3215, 11273, 3210, 24088, 270, 797, 352, 3133, 281, 479, 326, 18918, 841, 33876, 46234, 651, 320, 1805, 281, 2096, 253, 1953, 253, 4477, 12661, 50275, 8826, 5430, 891, 1928, 751, 253, 7103, 875, 253, 46234, 432, 253, 1006, 800, 1566, 285, 432, 253, 1524, 10186, 310, 2581, 5075, 891, 5476, 1529, 625, 15246, 1039, 310, 281, 921, 253, 3045, 327, 15450, 8892, 970, 253, 4561, 46234, 50276, 249, 2020, 891, 1158, 253, 4477, 1007, 387, 271, 1774, 1895, 533, 627, 403, 690, 7364, 273, 436, 1263, 24088, 417, 2819, 387, 33876, 46234, 5474, 33032, 2520, 2929, 14177, 281, 8338, 253, 3626, 4870, 326, 6635, 747, 3640, 390, 12342, 5742, 253, 4477, 12661, 767, 17082, 281, 17710, 46234, 10166, 327, 1027, 15302, 840, 597, 7277, 690, 13506, 941, 4561, 407, 3210, 285, 1524, 10186, 941, 2556, 281, 253, 18979, 17082, 4720, 597, 7525, 326, 253, 1524, 10186, 941, 476, 320, 973, 15524, 407, 247, 2176, 1006, 800, 1566, 20544, 50276, 18, 436, 2929, 2175, 941, 5978, 4870, 342, 253, 1361, 273, 21496, 8470, 352, 310, 271, 4722, 1895, 285, 891, 1158, 253, 2934, 273, 49519, 21496, 8470, 285, 941, 5978, 4870, 310, 4409, 2007, 17947, 374, 253, 4477, 1918, 247, 7000, 23356, 273, 2067, 941, 5978, 3082, 50274, 20881, 1255, 337, 253, 4060, 310, 247, 2372, 21643, 253, 4477, 1642, 247, 1953, 2139, 513, 21496, 8470, 1007, 347, 597, 513, 275, 253, 4060, 2299, 352, 3133, 326, 597, 858, 417, 3662, 352, 4768, 436, 19529, 374, 253, 19529, 
310, 417, 1881, 41010, 323, 1650, 387, 253, 5004, 273, 3239, 721, 253, 4477, 1691, 271, 1774, 4677, 275, 253, 30762, 2299, 347, 4879, 407, 253, 17857, 32888, 9353, 30628, 403, 417, 2424, 281, 1239, 14801, 1271, 495, 253, 4872, 13007, 275, 253, 4679, 1646, 281, 320, 5075, 3340, 275, 4677, 374, 253, 4477, 778, 971, 281, 10173, 253, 268, 6364, 790, 5921, 875, 4107, 82, 4719, 285, 19947, 577, 347, 2529, 275, 253, 6452, 2593, 4679, 275, 436, 789, 403, 417, 4209, 281, 4271, 247, 2014, 11360, 1566, 323, 14720, 670, 253, 20801, 285, 5289, 273, 3626, 21496, 8470, 891, 5194, 326, 436, 789, 4245, 247, 1175, 1265, 323, 271, 4722, 1895, 2299, 1580, 253, 4477, 858, 417, 1918, 247, 2590, 6452, 253, 7681, 7680, 273, 436, 19529, 3133, 281, 320, 5075, 50276, 2520, 2929, 2175, 271, 1774, 285, 4722, 1895, 2299, 352, 1057, 417, 2085, 247, 1175, 2217, 2900, 281, 253, 1895, 281, 2525, 253, 2534, 273, 17857, 32888, 5474, 33032, 2520, 2929, 29328, 767, 5593, 273, 3159, 10580, 21496, 281, 1894, 253, 5606, 273, 3159, 10580, 26332, 4294, 4719, 285, 17524, 7602, 1142, 5368, 4216, 5978, 3210, 403, 28671, 253, 4081, 5593, 403, 5118, 327, 1142, 1524, 10186, 15302, 253, 20544, 256, 18, 436, 2929, 2175, 271, 4722, 1895, 281, 12106, 2605, 5606, 432, 253, 12087, 275, 21496, 2317, 256, 19, 767, 4460, 5593, 403, 4081, 256, 20, 436, 2929, 310, 973, 15720, 50276, 783, 32213, 259, 18, 627, 310, 642, 1895, 15895, 697, 1892, 281, 871, 752, 310, 253, 15180, 1895, 5421, 275, 436, 2929, 259, 19, 697, 417, 2590, 849, 513, 253, 4081, 5593, 3662, 253, 5439, 1895, 259, 20, 627, 310, 247, 3480, 273, 5955, 273, 253, 24775, 273, 253, 4081, 5593, 259, 21, 627, 310, 247, 3480, 273, 16774, 12820, 323, 253, 4081, 2557, 50276, 74, 452, 690, 7350, 670, 436, 2929, 627, 310, 642, 1895, 15895, 697, 1892, 281, 871, 752, 310, 253, 15180, 1895, 5421, 275, 436, 2929, 352, 3133, 326, 253, 4736, 273, 436, 2929, 310, 760, 281, 10313, 285, 5513, 41637, 14170, 697, 417, 2590, 849, 513, 253, 4081, 5593, 3662, 253, 5439, 1895, 752, 310, 253, 30328, 273, 253, 4081, 2557, 275, 2426, 273, 18918, 2605, 5606, 50276, 9088, 310, 247, 3480, 273, 5955, 273, 253, 24775, 273, 253, 4081, 5593, 627, 310, 247, 3480, 273, 16774, 12820, 323, 253, 4081, 2557, 436, 2929, 2175, 271, 4722, 1895, 285, 29328, 767, 4460, 5593, 436, 2929, 19756, 690, 2234, 3603, 824, 347, 1895, 15895, 24775, 5955, 285, 16774, 12820, 347, 697, 1512, 1142, 32213, 891, 452, 281, 5583, 247, 12009, 2490, 187, 4118, 18435, 27, 783, 4477, 273, 436, 789, 5611, 747, 17082, 323, 4666, 21496, 326, 476, 2557, 253, 5606, 273, 253, 46234, 285, 7277, 731, 342, 5368, 4216, 21496, 7274, 285, 3368, 264, 327, 1524, 15302, 50276, 455, 30628, 5821, 326, 253, 789, 12453, 4722, 1895, 285, 326, 253, 4081, 5593, 403, 642, 306, 533, 627, 403, 1512, 1142, 32138, 275, 253, 3302, 2715, 273, 253, 2929, 285, 5747, 253, 11080, 6128, 273, 253, 4477, 352, 310, 6566, 326, 627, 403, 1335, 1512, 1142, 1527, 3533, 323, 436, 2929, 281, 320, 7607, 436, 807, 17857, 32888 ]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: this paper measures the generalization ability of modelbased agents ie muzero in comparison to modelfree agents the authors identify three key factors for procedural generalization planning selfsupervised representation learning and procedural data diversity however they find that these factors do not necessarily provide the same benefit for task generalization they argue for a move towards selfsupervised modelbased agents trained in rich procedural multitask environments the paper systematically investigates how muzero a sota modelbased agent performs on procedural generalization in procgen and task generalization in metaworld they identify three factors that improve procedural generalization through careful experiments next they test their effect for task generalization and find only a weak positive transfer to unseen tasks however the experiments and results testing the effect of planning and data diversity for task generalization were hard to follow the authors provide a thorough evaluation of muzero a modelbased agent on procedural generalization in procgen and task generalization in metaworld they identify three factors that improve procedural generalization and demonstrate challenges in task generalization docsepthis paper explores the application of the muzero agent for tasks which require generalization across environments namely procgen and metaworld on procgen they find that muzero in its standard form performs on par or better than strong modelfree methods furthermore when combined with auxiliary selfsupervised learning ssl losses there is a significant jump in performance which achieves a new state of the art the paper includes interesting control experiments disentangling the effects of different components for example it shows that both muzeros modified targets for the value functions as well as the tree search for action selection each separately contribute to performance another interesting finding is that adding auxiliary ssl objectives can help generalization performance on unseen environments even when they do not improve performance on the training environments which i found surprising but useful results are also reported on task generalization benchmarks from metaworld here the results are less strong and selfsupervision does not appear to help there appears to be some transfer between tasks but it is limited pros very clear and wellwritten paper very strong results on procgen informative ablations and insights novel application of modelbased methods to procedurally generated environments cons limited novelty of methods the authors do not mention code release in their reproducibility statement in section 42 is there a reason why muzero is only trained for 50m while the baselines are trained for 400m steps it would be helpful to see how muzero performs when it is run for 400m steps or is this too slow if so this should be noted somewhere if it is too slow then it would be helpful to compare to the baseline performance at 50m steps overall this is a strong paper and i recommend acceptance it investigates the application of a modelbased rl mbrl to procedurally generated environments which contrasts with most existing mbrl works which run on singleton environments in addition to strong results on procgen the ablations are quite informative in understanding the effect of the different algorithmic components the metaworld results are a bit disappointing but these
are nevertheless helpful to include my main issue with the paper is that the authors do not mention code release in their reproducibility statement while the detailed appendix is appreciated this is not a substitute for releasing code i am also not aware of thoroughly tested opensource implementations of muzero i strongly urge the authors to make their code open source otherwise it will be difficult for the research community to build on this work docsepthis paper evaluates how well modelbased rl specifically muzero generalizes in comparison to modelfree rl it empirically compares how planning representation learning and data diversity affect the generalization of agents to evaluate the effect of planning a qlearning agent is constructed to be as similar as possible to muzero without the mcts in experiments it is found that planning selfsupervised representation learning reconstruction contrastive selfpredictive and data diversity all improve generalization performance however results are not similar in the metaworld benchmarks where selfsupervision did not seem to improve results much the paper concludes that selfsupervision is a promising approach to improving the generalization of mbrl agents in procedural environments but perhaps it does not improve task generalization strengths while prior works have shown some sort of selfsupervised representation learning to be important in learning world models 1 this work seems to be the first to evaluate its effect on generalization since multitask rl and generalization seem to be an active research area i think this is a valuable contribution the paper shows that some conclusions as to generalization performance cannot be drawn when using a small training set which suggests researchers should evaluate their methods on diverse sets of environments to see whether the method affects generalization weaknesses the paper talks about procedural generalization and task generalization where the former corresponds to generalizing to unseen configurations of an environment with the same reward function and the latter corresponds to unseen reward functions in the same environment examples of each include procgen and ml1 for procedural generalization and ml10 and ml45 for task generalization to me it is unclear that ml10 and ml45 exactly fit into the category of task generalization since there are different types of objects different environments in the different task instances ml10 and ml45 have significantly fewer training environments than the procgen tasks where the paper claims that the improvement of selfsupervised representation learning is less apparent with lower data diversity perhaps the results on task generalization are simply due to this i am curious as to whether there is truly a difference in generalization performance when measuring generalization to new reward functions rather than new environment configurations perhaps there is a better environment to test this in where the data diversity can be controlled for 1 babaeizadeh mohammad et al models pixels and rewards evaluating design tradeoffs in visual modelbased reinforcement learning httpsarxivorgabs201204603 this paper shows that selfsupervised representation learning not only improves the training performance of mbrl but also the generalization performance and also points out that researchers may want to evaluate their algorithms on tasks with high data diversity however i feel as though the argument that task generalization differs from procedural
generalization is not well supported by the experiments i do not recommend accepting this paper in its current form docsepthe authors present a systematic empirical study of the effect of planning and model learning on generalization performance using the muzero agent they use 2 environments procgen and metaworld to respectively explore procedural generalization to new variants of the environment with the same reward structure and task generalisation to new structures of the reward function in the same environment their main contributions are their empirical results specifically that additional reconstruction or selfsupervised losses enable the muezero agent to achieve stateoftheart performance in procedural generalization but are not enough to promote a similar increase in performance in task generalization tasks in metaworld finally they also explore the data diversity dimension showing that having more diverse data during training can help procedural generalization even more strengths the paper is very well written and the motivation and findings are clearly presented and easy to follow they provide a good ablation study of the different losses and explored a few interesting cases weaknesses the main weakness of the paper lies in that the scope is somewhat limited to an empirical study with only a small number of takeaway learnings for the community which are not necessarily very surprising minor although significant strides have been made in modelbased systems in recent years 27 the most popular modelbased benchmarks consist of identical training and testing environments eg 26 68 and do not measure or optimize for for generalization at all remove double for the work is very clearly presented easy to understand and presents a number of ablation cases on some important considerations for rl agents eg modelfree planning and model learning the learnings from the work are clear though not necessarily very surprising ### Summary:
the paper evaluates the generalization capabilities of modelbased agents in particular muzero compared with modelfree agents reviewers agree that the paper is wellwritten and the topic is interesting the ablation study is especially interesting as it disentangles the effect of different algorithmic components some concerns are raised about the significance of this work as the scope is limited to an empirical study and the results are not necessarily very surprising since the paper presents clear results on an important and relevant topic i recommend acceptance
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 5593, 253, 26647, 3745, 273, 1566, 3169, 6083, 26332, 278, 7958, 2771, 275, 5301, 281, 771, 813, 658, 6083, 253, 4477, 4271, 1264, 2234, 2616, 323, 19993, 26647, 7219, 1881, 35421, 6779, 4715, 285, 19993, 941, 9991, 2299, 597, 1089, 326, 841, 2616, 513, 417, 7933, 2085, 253, 1072, 5649, 323, 4836, 26647, 597, 9059, 323, 247, 2118, 4404, 1881, 35421, 1566, 3169, 6083, 10166, 275, 6793, 19993, 1554, 262, 1945, 12620, 50276, 783, 2929, 24181, 2340, 684, 849, 278, 7958, 2771, 247, 256, 5503, 1566, 3169, 5570, 17923, 327, 19993, 26647, 275, 15613, 1541, 285, 4836, 26647, 275, 1313, 1403, 2875, 597, 4271, 1264, 2616, 326, 3157, 19993, 26647, 949, 10182, 4679, 1735, 597, 1071, 616, 1055, 323, 4836, 26647, 285, 1089, 760, 247, 5075, 2762, 3700, 281, 39709, 8892, 2299, 253, 4679, 285, 1543, 5175, 253, 1055, 273, 7219, 285, 941, 9991, 323, 4836, 26647, 497, 1892, 281, 956, 253, 4477, 2085, 247, 11080, 7103, 273, 278, 7958, 2771, 247, 1566, 3169, 5570, 327, 19993, 26647, 275, 15613, 1541, 285, 4836, 26647, 275, 1313, 1403, 2875, 597, 4271, 1264, 2616, 326, 3157, 19993, 5978, 285, 7568, 7881, 275, 4836, 26647, 50276, 7152, 33032, 2520, 2929, 33826, 253, 2898, 273, 253, 278, 7958, 2771, 5570, 323, 8892, 534, 2430, 26647, 2439, 12620, 10775, 15613, 1541, 285, 1313, 1403, 2875, 50276, 251, 15613, 1541, 597, 1089, 326, 278, 7958, 2771, 275, 697, 2629, 830, 17923, 327, 1061, 390, 1805, 685, 2266, 771, 813, 658, 3082, 33810, 672, 5678, 342, 24026, 1881, 35421, 4715, 256, 3433, 11655, 627, 310, 247, 1534, 6923, 275, 3045, 534, 33526, 247, 747, 1375, 273, 253, 1445, 253, 2929, 3797, 4722, 1453, 4679, 557, 290, 36874, 253, 2538, 273, 1027, 4295, 323, 1650, 352, 2722, 326, 1097, 278, 7958, 18503, 7321, 8571, 323, 253, 1318, 3470, 347, 973, 347, 253, 5202, 3186, 323, 2250, 5438, 1016, 11794, 8162, 281, 3045, 1529, 4722, 4560, 310, 326, 6240, 24026, 256, 3433, 16566, 476, 1361, 26647, 3045, 327, 39709, 12620, 1014, 672, 597, 513, 417, 3157, 3045, 327, 253, 3733, 12620, 534, 891, 1119, 10084, 533, 4217, 50275, 16680, 403, 671, 2361, 327, 4836, 26647, 49602, 432, 1313, 1403, 2875, 1060, 253, 1543, 403, 1679, 2266, 285, 1881, 12185, 4694, 1057, 417, 3176, 281, 1361, 627, 4620, 281, 320, 690, 3700, 875, 8892, 533, 352, 310, 3710, 50276, 856, 84, 50276, 635, 2590, 285, 973, 15720, 2929, 50276, 635, 2266, 1543, 327, 15613, 1541, 50276, 37650, 800, 490, 77, 569, 285, 16039, 50276, 2369, 652, 2898, 273, 1566, 3169, 3082, 281, 3352, 8572, 4561, 12620, 50276, 5040, 50276, 15870, 38135, 273, 3082, 50275, 783, 4477, 513, 417, 3748, 2127, 3727, 275, 616, 38041, 3908, 50276, 249, 2593, 5976, 310, 627, 247, 1921, 2139, 278, 7958, 2771, 310, 760, 10166, 323, 2456, 78, 1223, 253, 1666, 25379, 403, 10166, 323, 9166, 78, 5018, 352, 651, 320, 9371, 281, 923, 849, 278, 7958, 2771, 17923, 672, 352, 310, 1408, 323, 9166, 78, 5018, 390, 310, 436, 1512, 3468, 604, 594, 436, 943, 320, 4879, 9366, 604, 352, 310, 1512, 3468, 840, 352, 651, 320, 9371, 281, 7277, 281, 253, 8245, 3045, 387, 2456, 78, 5018, 4583, 436, 310, 247, 2266, 2929, 285, 891, 5583, 14924, 352, 2340, 684, 253, 2898, 273, 247, 1566, 3169, 391, 77, 278, 1288, 77, 281, 3352, 8572, 4561, 12620, 534, 39165, 342, 954, 5368, 278, 1288, 77, 2987, 534, 327, 1408, 327, 47736, 12620, 275, 1635, 281, 2266, 1543, 327, 15613, 1541, 253, 490, 77, 569, 403, 3240, 27096, 275, 4685, 253, 1055, 273, 253, 1027, 5933, 280, 4295, 253, 1313, 
1403, 2875, 1543, 403, 247, 2372, 31623, 533, 841, 403, 17837, 9371, 281, 2486, 50276, 2577, 2022, 2523, 342, 253, 2929, 310, 326, 253, 4477, 513, 417, 3748, 2127, 3727, 275, 616, 38041, 3908, 1223, 253, 7000, 30762, 310, 14109, 436, 310, 417, 247, 16502, 323, 20437, 2127, 891, 717, 671, 417, 6600, 273, 16575, 5762, 13279, 1505, 27558, 273, 278, 7958, 2771, 891, 7052, 21434, 253, 4477, 281, 1056, 616, 2127, 1527, 2603, 5010, 352, 588, 320, 2834, 323, 253, 2561, 3114, 281, 1973, 327, 436, 789, 50276, 7152, 33032, 2520, 2929, 44995, 849, 973, 1566, 3169, 391, 77, 5742, 278, 7958, 2771, 2087, 4219, 275, 5301, 281, 771, 813, 658, 391, 77, 352, 45190, 26662, 849, 7219, 6779, 4715, 285, 941, 9991, 2818, 253, 26647, 273, 6083, 281, 7472, 253, 1055, 273, 7219, 247, 2805, 28269, 5570, 310, 8818, 281, 320, 347, 2074, 347, 1896, 281, 278, 7958, 2771, 1293, 253, 278, 291, 84, 275, 4679, 352, 310, 1119, 326, 7219, 1881, 35421, 6779, 4715, 14433, 4499, 422, 1881, 22714, 422, 285, 941, 9991, 512, 3157, 26647, 3045, 2299, 1543, 403, 417, 2074, 275, 253, 11419, 1533, 49602, 835, 1881, 12185, 4694, 858, 417, 1646, 281, 3157, 1543, 1199, 253, 2929, 20097, 326, 1881, 12185, 4694, 310, 247, 12532, 2746, 281, 11138, 253, 26647, 273, 278, 1288, 77, 6083, 275, 19993, 12620, 533, 4931, 352, 1057, 417, 3157, 4836, 26647, 20544, 50275, 6050, 2720, 2987, 452, 2011, 690, 3686, 273, 1881, 35421, 6779, 4715, 281, 320, 1774, 275, 4715, 1533, 3210, 337, 436, 789, 3133, 281, 320, 253, 806, 281, 7472, 697, 1055, 327, 26647, 1580, 1554, 262, 1945, 391, 77, 285, 26647, 3133, 281, 320, 271, 3939, 2561, 2170, 891, 1158, 436, 310, 247, 9865, 7680, 50276, 783, 2929, 2722, 326, 690, 11815, 347, 281, 26647, 3045, 2550, 320, 8392, 672, 970, 247, 1355, 3733, 873, 534, 5936, 8607, 943, 7472, 616, 3082, 327, 11117, 5239, 273, 12620, 281, 923, 1880, 253, 1332, 11852, 26647, 50276, 20881, 1255, 265, 50275, 783, 2929, 12088, 670, 19993, 26647, 285, 4836, 26647, 835, 253, 3438, 10140, 281, 2087, 3006, 281, 39709, 16012, 273, 271, 3126, 342, 253, 1072, 10921, 1159, 285, 253, 6158, 10140, 281, 39709, 10921, 3470, 275, 253, 1072, 3126, 6667, 273, 1016, 2486, 15613, 1541, 285, 13361, 18, 323, 19993, 26647, 285, 13361, 740, 285, 13361, 1857, 323, 4836, 26647, 281, 479, 352, 310, 12744, 326, 13361, 740, 285, 13361, 1857, 4555, 4944, 715, 253, 7140, 273, 4836, 26647, 1580, 627, 403, 1027, 3510, 273, 5113, 1027, 12620, 275, 253, 1027, 4836, 10872, 50276, 1686, 740, 285, 13361, 1857, 452, 3012, 11184, 3733, 12620, 685, 253, 15613, 1541, 8892, 835, 253, 2929, 3916, 326, 253, 7756, 273, 1881, 35421, 6779, 4715, 310, 1679, 5165, 342, 2406, 941, 9991, 4931, 253, 1543, 327, 4836, 16691, 1320, 403, 3365, 1955, 281, 436, 891, 717, 14338, 347, 281, 1880, 627, 310, 7777, 247, 3064, 275, 26647, 3045, 672, 10499, 26647, 281, 747, 10921, 3470, 2581, 685, 747, 3126, 16012, 4931, 627, 310, 247, 1805, 3126, 281, 1071, 436, 275, 835, 253, 941, 9991, 476, 320, 6537, 323, 50276, 18, 5366, 3348, 478, 796, 73, 278, 1368, 20136, 1162, 355, 3210, 15115, 285, 23267, 16344, 2216, 5454, 14273, 275, 5304, 1566, 3169, 35221, 4715, 5987, 39962, 2061, 5375, 1252, 15781, 29251, 3614, 39962, 2061, 5375, 1252, 15781, 29251, 436, 2929, 2722, 326, 1881, 35421, 6779, 4715, 417, 760, 19132, 253, 3733, 3045, 273, 278, 1288, 77, 533, 671, 253, 26647, 3045, 285, 671, 2792, 562, 326, 8607, 778, 971, 281, 7472, 616, 11333, 327, 8892, 342, 1029, 941, 9991, 2299, 891, 1928, 347, 2167, 253, 4154, 326, 4836, 26647, 19986, 432, 19993, 26647, 310, 417, 973, 4516, 407, 253, 4679, 891, 513, 417, 
5583, 18738, 436, 2929, 275, 697, 1655, 830, 5474, 339, 431, 248, 4477, 1246, 247, 12082, 16774, 1263, 273, 253, 1055, 273, 7219, 285, 1566, 4715, 327, 26647, 3045, 970, 253, 278, 7958, 2771, 5570, 597, 897, 374, 12620, 15613, 1541, 285, 1313, 1403, 2875, 281, 2975, 8338, 19993, 26647, 281, 747, 11640, 273, 253, 3126, 342, 253, 1072, 10921, 2605, 285, 4836, 2087, 5837, 281, 747, 5289, 273, 253, 10921, 1159, 275, 253, 1072, 3126, 616, 2022, 9021, 403, 616, 16774, 1543, 5742, 326, 3081, 14433, 390, 1881, 35421, 11655, 8046, 253, 278, 489, 10528, 5570, 281, 5115, 1375, 23037, 14387, 3045, 275, 19993, 26647, 533, 403, 417, 2217, 281, 8591, 247, 2074, 2572, 275, 3045, 275, 4836, 26647, 8892, 275, 1313, 1403, 2875, 4720, 597, 671, 8338, 253, 941, 9991, 7877, 4645, 326, 1907, 625, 11117, 941, 1309, 3733, 476, 1361, 19993, 26647, 1014, 625, 20544, 50274, 783, 2929, 310, 1077, 973, 3542, 285, 253, 16038, 285, 4342, 403, 4518, 3559, 285, 3477, 281, 956, 50276, 9328, 2085, 247, 1175, 28913, 1263, 273, 253, 1027, 11655, 285, 14859, 247, 1643, 4722, 2219, 50276, 20881, 1255, 265, 50275, 783, 2022, 14855, 273, 253, 2929, 8696, 275, 326, 253, 7990, 310, 8489, 3710, 281, 271, 16774, 1263, 342, 760, 247, 1355, 1180, 273, 1379, 12594, 3037, 723, 323, 253, 3114, 534, 403, 417, 7933, 1077, 10084, 50275, 37585, 3738, 1534, 47582, 452, 644, 1160, 275, 1566, 3169, 2718, 275, 3332, 1107, 3435, 253, 954, 4633, 1566, 3169, 49602, 2882, 273, 8931, 3733, 285, 5175, 12620, 24088, 3436, 9934, 285, 513, 417, 2557, 390, 22318, 323, 323, 26647, 387, 512, 5386, 4021, 323, 50276, 783, 789, 310, 1077, 4518, 3559, 3477, 281, 2096, 285, 10262, 247, 1180, 273, 28913, 2219, 327, 690, 1774, 15711, 323, 391, 77, 6083, 24088, 771, 813, 658, 7219, 285, 1566, 4715, 253, 3037, 723, 432, 253, 789, 403, 2590, 2167, 417, 7933, 1077, 10084, 50276, 187, 187, 4118, 18435, 27, 783, 2929, 44995, 253, 26647, 13789, 273, 1566, 3169, 6083, 275, 1798, 278, 7958, 2771, 2429, 342, 771, 813, 658, 6083, 30628, 5194, 326, 253, 2929, 310, 973, 15720, 285, 253, 9400, 310, 4722, 253, 28913, 1263, 310, 3340, 4722, 347, 352, 557, 290, 19236, 253, 1055, 273, 1027, 5933, 280, 4295, 690, 7350, 403, 5439, 670, 253, 8453, 273, 436, 789, 347, 253, 7990, 310, 3710, 281, 271, 16774, 1263, 285, 253, 1543, 403, 417, 7933, 1077, 10084, 50275, 17480, 253, 2929, 10262, 2590, 1543, 327, 271, 1774, 285, 4623, 9400, 891, 5583, 14924 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 5593, 253, 26647, 3745, 273, 1566, 3169, 6083, 26332, 278, 7958, 2771, 275, 5301, 281, 771, 813, 658, 6083, 253, 4477, 4271, 1264, 2234, 2616, 323, 19993, 26647, 7219, 1881, 35421, 6779, 4715, 285, 19993, 941, 9991, 2299, 597, 1089, 326, 841, 2616, 513, 417, 7933, 2085, 253, 1072, 5649, 323, 4836, 26647, 597, 9059, 323, 247, 2118, 4404, 1881, 35421, 1566, 3169, 6083, 10166, 275, 6793, 19993, 1554, 262, 1945, 12620, 50276, 783, 2929, 24181, 2340, 684, 849, 278, 7958, 2771, 247, 256, 5503, 1566, 3169, 5570, 17923, 327, 19993, 26647, 275, 15613, 1541, 285, 4836, 26647, 275, 1313, 1403, 2875, 597, 4271, 1264, 2616, 326, 3157, 19993, 26647, 949, 10182, 4679, 1735, 597, 1071, 616, 1055, 323, 4836, 26647, 285, 1089, 760, 247, 5075, 2762, 3700, 281, 39709, 8892, 2299, 253, 4679, 285, 1543, 5175, 253, 1055, 273, 7219, 285, 941, 9991, 323, 4836, 26647, 497, 1892, 281, 956, 253, 4477, 2085, 247, 11080, 7103, 273, 278, 7958, 2771, 247, 1566, 3169, 5570, 327, 19993, 26647, 275, 15613, 1541, 285, 4836, 26647, 275, 1313, 1403, 2875, 597, 4271, 1264, 2616, 326, 3157, 19993, 5978, 285, 7568, 7881, 275, 4836, 26647, 50276, 7152, 33032, 2520, 2929, 33826, 253, 2898, 273, 253, 278, 7958, 2771, 5570, 323, 8892, 534, 2430, 26647, 2439, 12620, 10775, 15613, 1541, 285, 1313, 1403, 2875, 50276, 251, 15613, 1541, 597, 1089, 326, 278, 7958, 2771, 275, 697, 2629, 830, 17923, 327, 1061, 390, 1805, 685, 2266, 771, 813, 658, 3082, 33810, 672, 5678, 342, 24026, 1881, 35421, 4715, 256, 3433, 11655, 627, 310, 247, 1534, 6923, 275, 3045, 534, 33526, 247, 747, 1375, 273, 253, 1445, 253, 2929, 3797, 4722, 1453, 4679, 557, 290, 36874, 253, 2538, 273, 1027, 4295, 323, 1650, 352, 2722, 326, 1097, 278, 7958, 18503, 7321, 8571, 323, 253, 1318, 3470, 347, 973, 347, 253, 5202, 3186, 323, 2250, 5438, 1016, 11794, 8162, 281, 3045, 1529, 4722, 4560, 310, 326, 6240, 24026, 256, 3433, 16566, 476, 1361, 26647, 3045, 327, 39709, 12620, 1014, 672, 597, 513, 417, 3157, 3045, 327, 253, 3733, 12620, 534, 891, 1119, 10084, 533, 4217, 50275, 16680, 403, 671, 2361, 327, 4836, 26647, 49602, 432, 1313, 1403, 2875, 1060, 253, 1543, 403, 1679, 2266, 285, 1881, 12185, 4694, 1057, 417, 3176, 281, 1361, 627, 4620, 281, 320, 690, 3700, 875, 8892, 533, 352, 310, 3710, 50276, 856, 84, 50276, 635, 2590, 285, 973, 15720, 2929, 50276, 635, 2266, 1543, 327, 15613, 1541, 50276, 37650, 800, 490, 77, 569, 285, 16039, 50276, 2369, 652, 2898, 273, 1566, 3169, 3082, 281, 3352, 8572, 4561, 12620, 50276, 5040, 50276, 15870, 38135, 273, 3082, 50275, 783, 4477, 513, 417, 3748, 2127, 3727, 275, 616, 38041, 3908, 50276, 249, 2593, 5976, 310, 627, 247, 1921, 2139, 278, 7958, 2771, 310, 760, 10166, 323, 2456, 78, 1223, 253, 1666, 25379, 403, 10166, 323, 9166, 78, 5018, 352, 651, 320, 9371, 281, 923, 849, 278, 7958, 2771, 17923, 672, 352, 310, 1408, 323, 9166, 78, 5018, 390, 310, 436, 1512, 3468, 604, 594, 436, 943, 320, 4879, 9366, 604, 352, 310, 1512, 3468, 840, 352, 651, 320, 9371, 281, 7277, 281, 253, 8245, 3045, 387, 2456, 78, 5018, 4583, 436, 310, 247, 2266, 2929, 285, 891, 5583, 14924, 352, 2340, 684, 253, 2898, 273, 247, 1566, 3169, 391, 77, 278, 1288, 77, 281, 3352, 8572, 4561, 12620, 534, 39165, 342, 954, 5368, 278, 1288, 77, 2987, 534, 327, 1408, 327, 47736, 12620, 275, 1635, 281, 2266, 1543, 327, 15613, 1541, 253, 490, 77, 569, 403, 3240, 27096, 275, 4685, 253, 1055, 273, 253, 1027, 5933, 280, 4295, 253, 1313, 
1403, 2875, 1543, 403, 247, 2372, 31623, 533, 841, 403, 17837, 9371, 281, 2486, 50276, 2577, 2022, 2523, 342, 253, 2929, 310, 326, 253, 4477, 513, 417, 3748, 2127, 3727, 275, 616, 38041, 3908, 1223, 253, 7000, 30762, 310, 14109, 436, 310, 417, 247, 16502, 323, 20437, 2127, 891, 717, 671, 417, 6600, 273, 16575, 5762, 13279, 1505, 27558, 273, 278, 7958, 2771, 891, 7052, 21434, 253, 4477, 281, 1056, 616, 2127, 1527, 2603, 5010, 352, 588, 320, 2834, 323, 253, 2561, 3114, 281, 1973, 327, 436, 789, 50276, 7152, 33032, 2520, 2929, 44995, 849, 973, 1566, 3169, 391, 77, 5742, 278, 7958, 2771, 2087, 4219, 275, 5301, 281, 771, 813, 658, 391, 77, 352, 45190, 26662, 849, 7219, 6779, 4715, 285, 941, 9991, 2818, 253, 26647, 273, 6083, 281, 7472, 253, 1055, 273, 7219, 247, 2805, 28269, 5570, 310, 8818, 281, 320, 347, 2074, 347, 1896, 281, 278, 7958, 2771, 1293, 253, 278, 291, 84, 275, 4679, 352, 310, 1119, 326, 7219, 1881, 35421, 6779, 4715, 14433, 4499, 422, 1881, 22714, 422, 285, 941, 9991, 512, 3157, 26647, 3045, 2299, 1543, 403, 417, 2074, 275, 253, 11419, 1533, 49602, 835, 1881, 12185, 4694, 858, 417, 1646, 281, 3157, 1543, 1199, 253, 2929, 20097, 326, 1881, 12185, 4694, 310, 247, 12532, 2746, 281, 11138, 253, 26647, 273, 278, 1288, 77, 6083, 275, 19993, 12620, 533, 4931, 352, 1057, 417, 3157, 4836, 26647, 20544, 50275, 6050, 2720, 2987, 452, 2011, 690, 3686, 273, 1881, 35421, 6779, 4715, 281, 320, 1774, 275, 4715, 1533, 3210, 337, 436, 789, 3133, 281, 320, 253, 806, 281, 7472, 697, 1055, 327, 26647, 1580, 1554, 262, 1945, 391, 77, 285, 26647, 3133, 281, 320, 271, 3939, 2561, 2170, 891, 1158, 436, 310, 247, 9865, 7680, 50276, 783, 2929, 2722, 326, 690, 11815, 347, 281, 26647, 3045, 2550, 320, 8392, 672, 970, 247, 1355, 3733, 873, 534, 5936, 8607, 943, 7472, 616, 3082, 327, 11117, 5239, 273, 12620, 281, 923, 1880, 253, 1332, 11852, 26647, 50276, 20881, 1255, 265, 50275, 783, 2929, 12088, 670, 19993, 26647, 285, 4836, 26647, 835, 253, 3438, 10140, 281, 2087, 3006, 281, 39709, 16012, 273, 271, 3126, 342, 253, 1072, 10921, 1159, 285, 253, 6158, 10140, 281, 39709, 10921, 3470, 275, 253, 1072, 3126, 6667, 273, 1016, 2486, 15613, 1541, 285, 13361, 18, 323, 19993, 26647, 285, 13361, 740, 285, 13361, 1857, 323, 4836, 26647, 281, 479, 352, 310, 12744, 326, 13361, 740, 285, 13361, 1857, 4555, 4944, 715, 253, 7140, 273, 4836, 26647, 1580, 627, 403, 1027, 3510, 273, 5113, 1027, 12620, 275, 253, 1027, 4836, 10872, 50276, 1686, 740, 285, 13361, 1857, 452, 3012, 11184, 3733, 12620, 685, 253, 15613, 1541, 8892, 835, 253, 2929, 3916, 326, 253, 7756, 273, 1881, 35421, 6779, 4715, 310, 1679, 5165, 342, 2406, 941, 9991, 4931, 253, 1543, 327, 4836, 16691, 1320, 403, 3365, 1955, 281, 436, 891, 717, 14338, 347, 281, 1880, 627, 310, 7777, 247, 3064, 275, 26647, 3045, 672, 10499, 26647, 281, 747, 10921, 3470, 2581, 685, 747, 3126, 16012, 4931, 627, 310, 247, 1805, 3126, 281, 1071, 436, 275, 835, 253, 941, 9991, 476, 320, 6537, 323, 50276, 18, 5366, 3348, 478, 796, 73, 278, 1368, 20136, 1162, 355, 3210, 15115, 285, 23267, 16344, 2216, 5454, 14273, 275, 5304, 1566, 3169, 35221, 4715, 5987, 39962, 2061, 5375, 1252, 15781, 29251, 3614, 39962, 2061, 5375, 1252, 15781, 29251, 436, 2929, 2722, 326, 1881, 35421, 6779, 4715, 417, 760, 19132, 253, 3733, 3045, 273, 278, 1288, 77, 533, 671, 253, 26647, 3045, 285, 671, 2792, 562, 326, 8607, 778, 971, 281, 7472, 616, 11333, 327, 8892, 342, 1029, 941, 9991, 2299, 891, 1928, 347, 2167, 253, 4154, 326, 4836, 26647, 19986, 432, 19993, 26647, 310, 417, 973, 4516, 407, 253, 4679, 891, 513, 417, 
5583, 18738, 436, 2929, 275, 697, 1655, 830, 5474, 339, 431, 248, 4477, 1246, 247, 12082, 16774, 1263, 273, 253, 1055, 273, 7219, 285, 1566, 4715, 327, 26647, 3045, 970, 253, 278, 7958, 2771, 5570, 597, 897, 374, 12620, 15613, 1541, 285, 1313, 1403, 2875, 281, 2975, 8338, 19993, 26647, 281, 747, 11640, 273, 253, 3126, 342, 253, 1072, 10921, 2605, 285, 4836, 2087, 5837, 281, 747, 5289, 273, 253, 10921, 1159, 275, 253, 1072, 3126, 616, 2022, 9021, 403, 616, 16774, 1543, 5742, 326, 3081, 14433, 390, 1881, 35421, 11655, 8046, 253, 278, 489, 10528, 5570, 281, 5115, 1375, 23037, 14387, 3045, 275, 19993, 26647, 533, 403, 417, 2217, 281, 8591, 247, 2074, 2572, 275, 3045, 275, 4836, 26647, 8892, 275, 1313, 1403, 2875, 4720, 597, 671, 8338, 253, 941, 9991, 7877, 4645, 326, 1907, 625, 11117, 941, 1309, 3733, 476, 1361, 19993, 26647, 1014, 625, 20544, 50274, 783, 2929, 310, 1077, 973, 3542, 285, 253, 16038, 285, 4342, 403, 4518, 3559, 285, 3477, 281, 956, 50276, 9328, 2085, 247, 1175, 28913, 1263, 273, 253, 1027, 11655, 285, 14859, 247, 1643, 4722, 2219, 50276, 20881, 1255, 265, 50275, 783, 2022, 14855, 273, 253, 2929, 8696, 275, 326, 253, 7990, 310, 8489, 3710, 281, 271, 16774, 1263, 342, 760, 247, 1355, 1180, 273, 1379, 12594, 3037, 723, 323, 253, 3114, 534, 403, 417, 7933, 1077, 10084, 50275, 37585, 3738, 1534, 47582, 452, 644, 1160, 275, 1566, 3169, 2718, 275, 3332, 1107, 3435, 253, 954, 4633, 1566, 3169, 49602, 2882, 273, 8931, 3733, 285, 5175, 12620, 24088, 3436, 9934, 285, 513, 417, 2557, 390, 22318, 323, 323, 26647, 387, 512, 5386, 4021, 323, 50276, 783, 789, 310, 1077, 4518, 3559, 3477, 281, 2096, 285, 10262, 247, 1180, 273, 28913, 2219, 327, 690, 1774, 15711, 323, 391, 77, 6083, 24088, 771, 813, 658, 7219, 285, 1566, 4715, 253, 3037, 723, 432, 253, 789, 403, 2590, 2167, 417, 7933, 1077, 10084, 50276, 187, 187, 4118, 18435, 27, 783, 2929, 44995, 253, 26647, 13789, 273, 1566, 3169, 6083, 275, 1798, 278, 7958, 2771, 2429, 342, 771, 813, 658, 6083, 30628, 5194, 326, 253, 2929, 310, 973, 15720, 285, 253, 9400, 310, 4722, 253, 28913, 1263, 310, 3340, 4722, 347, 352, 557, 290, 19236, 253, 1055, 273, 1027, 5933, 280, 4295, 690, 7350, 403, 5439, 670, 253, 8453, 273, 436, 789, 347, 253, 7990, 310, 3710, 281, 271, 16774, 1263, 285, 253, 1543, 403, 417, 7933, 1077, 10084, 50275, 17480, 253, 2929, 10262, 2590, 1543, 327, 271, 1774, 285, 4623, 9400, 891, 5583, 14924 ]
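The reviews in the row above repeatedly credit auxiliary self-supervised losses (reconstruction, contrastive, self-predictive) for the reported generalization gains. As a purely illustrative aid, the sketch below shows one common way such an auxiliary term is combined with an agent's main training loss; it is a toy self-predictive variant under my own assumptions (the module names, shapes, and the weighting constant are invented here) and is not the architecture or objective of the paper under review.

```python
# Illustrative sketch only: a toy auxiliary self-predictive loss added to an
# agent's main loss, in the spirit of the reconstruction / contrastive /
# self-predictive objectives the reviews above mention. All names, shapes,
# and the 1.0 weight are assumptions, not the reviewed paper's design.
import torch
import torch.nn as nn

class TinyAgent(nn.Module):
    def __init__(self, obs_dim=32, act_dim=4, latent_dim=16):
        super().__init__()
        self.encoder = nn.Linear(obs_dim, latent_dim)                 # obs -> latent
        self.dynamics = nn.Linear(latent_dim + act_dim, latent_dim)   # predicts next latent
        self.value_head = nn.Linear(latent_dim, 1)

    def forward(self, obs, act, next_obs):
        z = self.encoder(obs)
        z_next_pred = self.dynamics(torch.cat([z, act], dim=-1))
        with torch.no_grad():                                          # stop-gradient target
            z_next_tgt = self.encoder(next_obs)
        return self.value_head(z), z_next_pred, z_next_tgt

agent = TinyAgent()
obs, act, next_obs = torch.randn(8, 32), torch.randn(8, 4), torch.randn(8, 32)
value_target = torch.randn(8, 1)

value, z_pred, z_tgt = agent(obs, act, next_obs)
rl_loss = nn.functional.mse_loss(value, value_target)    # stand-in for the agent's RL loss
ssl_loss = nn.functional.mse_loss(z_pred, z_tgt)          # auxiliary self-predictive term
total_loss = rl_loss + 1.0 * ssl_loss                     # weight chosen arbitrarily here
total_loss.backward()
```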
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the paper presents an approach to learning user representations based on activity patterns on ecommerce websites and a user profile the method turns activity patterns into a sequence of discrete tokens based on the action type and attributes that correspond to a certain action a selfsupervised transformer is trained on this data with a masked language modeling mlm objective data is compartmentalized as longterm patterns such as a purchase or the use of reward points or shortterm such as clickthrough data or user profile information such as user age gender or location separate segment and position embeddings are used within each compartment since each masked token is a highlevel action type that may have many attributes predicting a masked token is cast as a multilabel classification problem over attributes the trained model is evaluated on downstream user targeting and user attribute prediction benchmarks overall the paper is a straightforward application of bert pretraining with mlm only to learn user representations the main contribution of this work is the tokenizationdiscretization strategy and multilabel classification to enable masked language modeling strengths the overall approach of leveraging user activity patterns to learn user embeddings in a selfsupervised way is well motivated and uses wellestablished methods like bert to achieve this goal the discretizationtokenization and multilabel classification idea is well thought out weaknesses limited discussion of previous work in this space of user representation learning the user modeling section lacks any citations straightforward bert application with somewhat mixed results on some tasks case 1 on user targeting and hascar attribute prediction it is hard to determine what improvements are meaningful since the datasets used are standard to the best of my knowledge and there isnt any significance testing questions comments what is the baseline accuracy of predicting the same genre as the previous genre purchased in the next genre prediction task the limited supervision setting was unclear to me especially in figures 4 5 please correct me if ive interpreted the results incorrectly figures 4 5 investigate supervised pretraining on labeled user data the transformer mtl model when pretrained on 12 classification tasks that require supervised data 1 gets better with more supervised pretraining data 2 does better than userbert with the same pretraining on supervised tasks figure 5 shows that userbert begins to overfit a motivation mentioned was that selfsupervised pretraining allows for strong semisupervised learning and learning with limited supervision however most performance curves report performance as a function of the number of epochs rather than the amount of labeled data available for downstream tasks like user targeting and attribute prediction strong performance with limited labeled data would make this work better the paragraph that starts with inspired by the bert model and its variations contains a sentence like understanding users in a similar way to how language is understood and syntax and semantics of sentence are comparable with behavioral patterns and characteristics of a user both especially the latter are pretty bold claims and are largely unnecessary to motivate the proposed model it should suffice to say that the model is inspired by bert which has been immensely useful across a host of nlp tasks the
section on user modeling has no references to previous work that looks at building user representations the recommendation systems literature contains several such pieces of work and these should be discussed along these lines how would the learned representations compare to those obtained from say svd of the userpurchase userclick usersearch matrix this work appears to have some potential for misuse especially when trying to infer protected user attributes from user embeddings this deserves to be discussed in some detail its not clear to me why reporting area under the roc curve as a function of the number of epochs is meaningful why arent models trained with early stopping docsepuserbert selfsupervised user representation learning summary the paper provides an extension of bert to user data for pretraining user representation in a selfsupervised manner in particular it analogises the user behaviour sequence to words in a sentence and leverages the masked language model mlm approach typically used in nlp to train the user embedding to facilitate such extension the paper also proposes a discretisation approach and a unified input structuring to include longterm shortterm and demographic information pros 1 even though the idea of extending the selfsupervised pretraining for user representations is not new it is still an interesting area of research 2 the discretisation of user behaviour signals over longterm and shortterm to form behavioural words is quite reasonable concerns the key concerns about the contributions of the paper are as follows overall the novelty of this work is very limited to elaborate the major contribution of this work is twofold discretisation of raw user behaviour sequences for longterm and shortterm and using the discretised aggregated behaviour words as inputs to the bert architecture as is the other claimed contributions such as having a unified architecture and experiments to validate the approach do not seem substantial when it comes to discretisation though the idea seems appropriate two crucial questions regarding this step are not validated 1 there is no empirical evidence presented in the paper which shows discretisation improves userberts accuracy since it is a major contribution i request the authors to design and implement an ablation study to address this point 2 seemingly the authors have come up with a handcrafted heuristicdriven approach for discretisation why cant it be datadriven too meaning can clustering of actions be done in a datadriven manner if so what is the difference in accuracy between the proposed heuristicdriven and the datadriven alternative when it comes to the second contribution it is certainly not novel ie the paper does not propose any architectural change to bert though the paper claims that the presented model is a unified model to learn long term short term and demographics based user profile the unification is brought upon as a byproduct of feeding multimodal inputs to vanilla bert hence the overall novelty of the paper is very limited the following comments are my major concerns in each section of the paper section 2 while the authors have reviewed some literature in transfer learning and self supervised learning and have cited some relevant work they have not cited even one reference in section 23 which is on user modeling ie the main theme of the paper i request the author to make a thorough survey and cite related work in section 23 and also highlight how userbert is different from them section 3 overall this section and section
4 lack cohesion and can be written clearer with the figures tables algorithms and descriptions this could help the reader better understand the approach for instance the following main points in the approach are not explained well 1 the paper states the final loss for one input sequence is the weighted sum of the losses of all masked tokens there is no detail what the weights are whether they are assigned based on heuristic or learnt 2 the approach considers ordinal attributes such as expense and age similar to categorical attributes eg each age has a unique embedding this seems counterintuitive and there is no empirical evidence to show that this counterintuitive design works well 3 it is not clear how to use the hidden representation to predict attributes from the transformed masked tokens more precisely it is possible that many attributes belonging to different actions are masked and then converted into one token embedding so what attributes are to be predicted in the final fully connected layer in this case 4 minor concern what does e stand for in equation 1 section 4 overall in this section the experimental design is not comprehensive and the results are not convincing for the following reasons along with the widedeep lstm transformers as baseline it would have been better to also include vanilla bert to the baselines against which the userbert can be compared in fact vanilla bert would be the closest and most appropriate baseline for comparison hence i request the authors to include it all the experiments are conducted on custom datasets since user profiling is an extremely useful and ubiquitous activity that benefits multiple domains i request the authors to experiment userbert on wellknown open source ecommerce and other user profiling datasets ref 1 and 2 in fact the profiles could be tested on downstream tasks like next genre prediction with these datasets this will help the reader to trust the userbert model better input representation being one of the major contributions of the paper it would give more insights if an ablation study is made on the user behaviour data longterm features shortterm features demographic features to compare and contrast the contribution and lift by each of the behaviour categories in the attribute prediction task within the two attributes experimented the performance of the proposed model is quite unconvincing would benefit if more experiments are performed for the next genre prediction though there are more than 10k genres each users typically have a very small subset of interest therefore it would be more informative if can compare the models map10 with the userlevel modes map10 the discussions of results are very vague and could be a lot deeper and precise questions during rebuttal period please address and clarify the concerns above reference 1 sun fei et al bert4rec sequential recommendation with bidirectional encoder representations from transformer proceedings of the 28th acm international conference on information and knowledge management 2019 2 kang wangcheng and julian mcauley selfattentive sequential recommendation 2018 ieee international conference on data mining icdm ieee 2018docsep i have read the author response and i still think the paper is limited in terms of novelty significance and experiments i would like to keep my current score this work proposed a selfsupervised pretraining approach to model users from user behavior timeseries data given timeseries data of user actions on ecommerce websites web browsing etc a selfsupervised 
prediction task similar to masked language modeling is introduced where a transformer model predicts masked actions from the context paper claims to show that the proposed pretraining approach performs better than baselines pretrained using multitasking on user behavior prediction tasks such as next purchase pros good user models can be useful in recommendation systems and paper tackles an important problem proposed pretraining approach based on timeseries data is interesting and looks reasonable paper is generally well written and easy to follow cons limited novelty and significance weak experiments related work on user modeling may not be comprehensive enough related work not comprehensive the paper doesnt do a good job of positioning the work in the context of prior work i dont see any papers discussed in the user modeling section of the related work which raises concerns about the significance of this work poor baselines the model is compared against baselines constructed by the authors some of which are vaguely described eg i dont understand the widedeep baseline i am not sure if these are strong baselines it is also unclear how good the multitask learning baselines are and there isnt any significant discussion about the tasks hyperparameters used to train these baselines limited novelty contributions the approach can be considered an application of masked language modeling im not sure if the details about tokenization bear much significance if the authors claim this to be a contribution the same should be verified by comparing against other ways of tokenizing the data modulo this detail the approach is a straightforward application of bertstyle pretraining i dont think the analogy between language and user behavior needs to be mentioned or adds any value masked language model pretraining can be applied to any timeseries data although the paper claims to model users i couldnt find details in the paper about how exactly a usermodel is constructed from the pretrained model and how these models are finetuned on the target task while the paper attempts to address an important problem it has serious issues in terms of limited novelty significance of contributions and weak experiments docsepclarity i find the paper lacks details on multiple aspects please see my comments below originality the idea introduced in the paper is interesting most of the originality comes from the fact that the author is converting a user modeling problem into a pretraining problem defined on a special vocabulary aka action types attribute ids significance i am not convinced that the result is significant mostly because the experiment is hard to reproduce due to lack of description of modeling details and the dataset being used some other comments it would be more helpful to point out in more detail how your work is different from other related work in section 21 and 22 in section 32 tokenization of user behavior sequences if i understand correctly at least some subset of the attribute type of a user action is largely dependent on an external topic model or topic classifier for example to identify the main topic of a webpage sport news or to recognize the relevant entity from a shop nike shoes this part is not clearly explained in the paper i would recommend using an appendix section to explain it further otherwise it is impossible for other readers to reproduce the results mentioned in this paper can you rephrase this sentence the token representation is computed by the concatenation and mean
calculation of the word embeddings of the attribute ids in each action under figure 1 i am confused how this is done is it a mean word embedding for each attribute id then we concatenate all of the mean embedding vectors in section 32 input representations could you give an example of a token i see you said age is a token are there any other examples is a token something only related to user demographics and action what is the difference between them this gives me confusion when reading the second paragraph under input representations under pretraining tasks the final loss for one input sequence is the weighted sum of the losses of all masked tokens can you elaborate on what the weighted sum refers to what is the weight are you assigning some weights to different types of user behavior eg longterm shortterm and user profiles how is that weight being decided in the experiment section 41 the dataset used in the paper does not seem publicly available or am i missing something this also blocks reproducibility of the paper results is there anything you can use in the public world otherwise can you at least give more details besides those already in 41 on the underlying dataset in the appendix ### Summary:
the paper discusses an extension of bert for learning user representations based on activity patterns in a selfsupervised setting all reviewers have concerns about the validity of the claims and the significance of the experimental results overall i agree with the reviewers that the paper needs more work to be published at iclr i recommend rejection
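Several of the reviews above ask how a masked behaviour token that carries many attribute ids can be embedded and then predicted as a multilabel target. The sketch below is one plausible reading of that mechanism, written only to make those questions concrete: attribute-id embeddings are mean-pooled into one token vector (the reviews also mention concatenation, omitted here for brevity), a transformer encodes the sequence, and masked positions are scored against all attribute ids with a binary cross-entropy loss. Every name, dimension, and design choice in this snippet is my assumption, not the reviewed paper's actual implementation.

```python
# Hypothetical sketch of masked-token multilabel attribute prediction over
# discretized user-behaviour tokens; all sizes and names are illustrative.
import torch
import torch.nn as nn

NUM_ATTR_IDS, DIM, SEQ_LEN, BATCH = 1000, 64, 12, 4

attr_emb = nn.Embedding(NUM_ATTR_IDS, DIM)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=DIM, nhead=4, batch_first=True),
    num_layers=2,
)
attr_head = nn.Linear(DIM, NUM_ATTR_IDS)    # multilabel scores over attribute ids
mask_emb = nn.Parameter(torch.zeros(DIM))   # learned mask-token vector

# Each action is a token with several attribute ids; a fixed toy shape
# (BATCH, SEQ_LEN, 3 attributes per action) keeps the example short.
attr_ids = torch.randint(0, NUM_ATTR_IDS, (BATCH, SEQ_LEN, 3))
tokens = attr_emb(attr_ids).mean(dim=2)     # mean-pool attribute embeddings per action

masked_pos = torch.zeros(BATCH, SEQ_LEN, dtype=torch.bool)
masked_pos[:, 3] = True                     # mask one position per sequence (toy choice)
tokens = torch.where(masked_pos.unsqueeze(-1), mask_emb, tokens)

hidden = encoder(tokens)
logits = attr_head(hidden[masked_pos])      # (num_masked, NUM_ATTR_IDS)

# Multilabel target: 1 for every attribute id the masked action contained.
targets = torch.zeros_like(logits)
targets.scatter_(1, attr_ids[masked_pos], 1.0)
loss = nn.BCEWithLogitsLoss()(logits, targets)
loss.backward()
```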
[tokenized columns omitted for this row: the input_ids token sequence, an all-ones attention_mask, and labels identical to the input_ids]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

This paper introduces a method for jointly sampling the structure and sequence of antibody loops in an iterative fashion that places fewer constraints on structure generation. The paper provides an interesting study of antibody loop generation with both novel methodology and extensive empirical evaluation; this combination of strengths makes it an excellent paper that should be of wide interest.

Strengths:
- The design of each component follows naturally from previous work such as Ingraham et al. 2019. This helps isolate the key methodological contributions (for example, no teacher forcing on structure) in an easy-to-interpret way.
- The coarsening procedure is of wider applicability in protein modeling beyond sampling CDR loops, and an ablation study demonstrates its value empirically for this work.
- The baselines chosen highlight key interactions among RefineGNN components and show that the model appears to effectively leverage the added complexity.
- Evaluations are thorough and span a number of datasets; this is one of the biggest strengths of the paper.

Weaknesses:
- The use of a predictor to evaluate neutralization is justified based on very recent work, and it is unclear that this practice is in line with broader norms in the antibody engineering community.

This paper provides an interesting study of antibody loop generation with both novel methodology and extensive empirical evaluation; this combination of strengths makes it an excellent paper that should be of wide interest.

docsep

This paper proposes a joint generative model that co-designs the sequence and the structure at the same time for the CDRs of antibodies, with the goal of enhancing binding specificity or neutralization capabilities. The proposed method and two existing baselines are evaluated on (1) perplexity on a held-out set, (2) perplexity and sequence recovery on known antigen-antibody complexes, and (3) redesigning the CDR-H3 of existing antibodies for coronavirus neutralization, as measured by a given neutralization predictor. RefineGNN, the proposed method, shows improved performance on all three tasks.

This paper studies the important problem of computationally designing antibody CDRs. The joint modeling approach over structures and sequences is novel and interesting. The generation method is flexible, and the authors also adapt it for conditional generation conditioned on the rest of the antibody and on given properties of interest, with the predictive model already available. The empirical results are convincing: both perplexity and sequence recovery are standard metrics in protein design, and the proposed method showed improvements in both on two separate data sets. The two baselines used for comparison are well described and meaningful.

Request for clarification: in Section 4.2 it is not immediately clear whether the antigen is also included in the conditioning or only the antibody itself. It would also be appreciated if the authors could add more discussion around when this proposed method applies in practice; for example, the method requires already knowing the frame of the antibody. When is a fixed-frame, variable-CDR-H3 design a reasonable assumption, and when can we realistically expect to have accurate predictors for the properties of interest? While the model is trained on unbound antibody structures, the second evaluation task is conditioned on antibody structures in the bound state within a complex, if I am not mistaken. Will there be a mismatch between the bound and unbound structures? Clarification and discussion around this point would also be appreciated.

An interesting approach to an important problem, with convincing empirical results and baselines. The approach might be limited to specific use cases in practice, depending on the availability of a predictive model for the properties of interest and on knowledge of the antibody frame.

docsep

The paper proposes a deep generative model, named Iterative Refinement Graph Neural Network, to generate antibody CDRs for Y-shaped antibodies. Specifically, it sequentially generates the CDR residue sequence and refines the global structure iteratively. Empirical results show superior performance compared with baselines.

Strengths:
- The paper is well written and easy to follow.
- The proposed refinement method has high novelty and outperforms state-of-the-art baseline methods.

Weakness:
- The experimental evaluation may be problematic: it is not convincing to use machine learning methods to predict the neutralization ability based on CDR-H3.

The proposed method is novel, and the paper is well written and validated by thorough empirical studies.

### Summary:
This paper proposes a novel generative modelling approach over both the sequences and the structure of proteins to co-design the CDR region of antibodies so as to achieve good binding/neutralization. The reviewers are in agreement that the problem is one of importance and that the technical and empirical contributions are strong. There are concerns over the relevance of evaluating the method by using a predictive model as ground truth; still, the overall contributions remain strong.
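Since the reviews above repeatedly rely on perplexity and sequence recovery as the evaluation metrics for CDR design, a minimal sketch of how those two numbers are typically computed may help. This is an illustration only: the function names and the per-residue averaging convention are assumptions, not the paper's own evaluation code.

```python
import math

def sequence_recovery(pred_seqs, true_seqs):
    """Fraction of CDR positions where the designed residue matches the native one."""
    matches, total = 0, 0
    for pred, native in zip(pred_seqs, true_seqs):
        assert len(pred) == len(native), "recovery is computed position-wise"
        matches += sum(p == n for p, n in zip(pred, native))
        total += len(native)
    return matches / total

def perplexity(per_residue_log_probs):
    """exp(mean negative log-likelihood) of the native residues under the model."""
    n_residues = sum(len(lp) for lp in per_residue_log_probs)
    total_nll = -sum(sum(lp) for lp in per_residue_log_probs)
    return math.exp(total_nll / n_residues)

# Example: two designed CDR-H3 loops scored against their natives (toy sequences).
print(sequence_recovery(["ARDYW", "GGSYF"], ["ARDYW", "GGTYF"]))   # 0.9
print(perplexity([[-0.1, -0.2, -0.3], [-0.4, -0.5]]))              # ~1.35
```

Under these assumed conventions, higher sequence recovery and lower perplexity both indicate that the generative model places more probability mass on the native CDR residues.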
[tokenized columns omitted for this row: input_ids, an all-ones attention_mask, and labels identical to the input_ids for the review/summary pair above]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

This paper proposes a novel model-perturbation-based method for the source-free domain adaptation (SFDA) problem under a mild assumption. The authors connect their variational model perturbation with Bayesian neural networks and propose an update rule for the perturbation parameters for source-free adaptation, avoiding severe changes to the source-domain model's weights while improving the model's generalizability. The experimental results on several domain adaptation benchmarks under three different settings (offline SFDA, generalized SFDA, and continual SFDA) verify the effectiveness of the proposed method.

Strengths:
1. As far as I am aware, this is the first time that a probabilistic model approach is introduced for source-free domain adaptation. The proposed method is technically sound and novel for the SFDA problem. The intuition behind the variational model perturbation method fits the SFDA setting well, which can bring some novel insights to the community.
2. The proposed method is efficient and can be combined with other SFDA methods.
3. The experiments cover three different SFDA scenarios and the results generally look promising. The analysis in Section 4.3 is comprehensive and addresses some of my concerns raised during reviewing.

Weaknesses:
1. The methodology section lacks some necessary description of the proposed method and the resulting algorithm, which leads to unclarity and ambiguity. For example: (1) What is the form of the specific objective function under the stated assumptions on the prior and posterior distributions? Could the authors further clarify the expression of the regularization term? (2) Is the main parameter sigma calculated and updated, if at all, directly by the neural network?
2. Some descriptions of the experimental section are confusing. For example, in lines 218-225, while describing offline SFDA, the paper uses "the training phase" to denote the target training process; however, for the generalized SFDA setting it is written that the model is first trained using 80% of the source data and then tested using the rest of the source data and all the target data. Then, in lines 245-247, only "the training phase" and "the test phase" are mentioned; I suppose these two phrases both refer to target training. Could the authors harmonize "source training" and "target training" to avoid ambiguity? Besides, could the source-model generation process be introduced to make the experimental description more complete, considering the particular setting of SFDA? I look forward to a clearer elaboration of the experimental details.
3. The experimental results only report accuracy and do not include the variance. As the proposed model perturbation method only optimizes the perturbation's variance and does not update the source model's weights, I am curious about the stability of the proposed method.
4. There are some unclear mathematical expressions. For example, in lines 110-111: "here we assume that we adopt the same prior, i.e., p(w_t) = p(Δw)". Does this mean that these two random variables have the same variance?
5. In line 91, the paper directly assumes the zero-mean isotropic Gaussian distribution as the posterior distribution family. Could the authors explain more about such choices and assumptions?

Thanks for the effort from the authors; the rebuttal has well addressed my concerns mentioned above, and I would like to raise my score to 7. The authors discussed the limitations in Section 2.

docsep

The challenge of source-free domain adaptation (SFDA) is that the source data is inaccessible after the source model is trained. Two prevalent approaches to address the problem are fine-tuning the source model on the target domain, and incorporating adapters into the source model and learning them with target data. However, the fine-tuned model usually generalizes poorly since it tends to be biased toward a specific target domain, while the adapter-modulating method is unable to handle large domain shifts, as all the convolutional layers and fully connected layers are frozen. To alleviate these problems, in this work the authors propose to perturb the parameters of the source model to achieve adaptation to the target domain. The model perturbation approach learns domain-specific knowledge through variational Bayesian inference, allowing the model to generalize to relevant domains. This model-agnostic framework can efficiently adapt to the target domain while largely preserving the discriminative ability of the source model. The method has been validated on various datasets and experimental settings.

Strengths:
- The work is well motivated, the writing is easy to follow, and the framework is general and flexible.
- It provides a novel probabilistic framework for SFDA.
- It demonstrates the theoretical connection between the probabilistic framework and Bayesian neural networks; the theoretical analysis of the method is clear and sound.
- The comprehensive empirical experiments validate the method in various settings.

Weaknesses:
- To make the paper self-contained, a more detailed description of the model training, or pseudocode of the training algorithm, is expected. For instance, in lines 227 and 237 the authors only mention that the expected likelihood adopts the same losses used in reference papers; it would be better to include the loss function in the supplementary.
- Since the authors adopt the local reparameterization trick (line 247), it would be better to briefly explain how the objective function in Eq. 3 is optimized, so that a general audience can understand why and how the trick is applied; in turn, this would also help the audience better understand the training workflow. I would recommend adding pseudocode of the training algorithm to the main body or supplementary to give readers a clear view of the training workflow.

docsep

In this paper, the authors propose to solve the source-free domain adaptation problem by perturbing the source model to adapt it to the target domain. The perturbations for the source model are learned through variational Bayesian inference. The authors provide a parameter sharing strategy to significantly reduce the number of learnable perturbations; therefore the proposed method is a flexible and lightweight framework and can be integrated with existing methods. Extensive experiments on 5 datasets under 3 different SFDA settings demonstrate the superior performance of variational model perturbation.

Strengths:
1. A flexible and lightweight framework, easy to integrate with previous methods.
2. Very few learnable parameters.
3. It decouples the process of domain adaptation into an invariant part (the static source model) and a varying part (the learnable perturbations).
4. The authors connect variational model perturbation with Bayesian neural networks, which provides a theoretical guarantee for the generalization performance of model perturbation.
5. Most experimental details required for reproducing the results are given; code was submitted as well, which is noteworthy.
6. The performance is competitive: variational model perturbation achieves the best results in almost all settings against the compared baselines.
7. The experiments take into account both SFDA and test-time adaptation settings, which seems very sufficient. Note that previous methods solved SFDA and test-time adaptation independently, e.g., SHOT [1] (SFDA), NRC [2] (SFDA), HCL [3] (SFDA), Tent [4] (test-time adaptation), BACS [5] (test-time adaptation), CoTTA [6] (test-time adaptation). This paper essentially unifies the two settings into one framework.

Minor issues:
1. Lines 170 and 199: typo "adaption".
2. Upper and lower case are not consistent, e.g., the spelling of KL-divergence in lines 95 and 110, and of "adaptive prior" in lines 126 and 268.
3. In Eq. 8, w appears for the first time; it should be w_t according to Eq. 2.

docsep

The paper proposes a variational Bayesian inference framework that perturbs source model parameters to address source-free domain adaptation. The proposed framework simultaneously adapts the source model to the target domain and preserves the discriminability of the model, and the paper introduces a parameter sharing strategy to achieve more efficient adaptation. The paper provides a theoretical connection between Bayesian learning and the proposed method, and conducts experiments on three source-free domain adaptation frameworks to verify the performance of the proposed method.

Strengths:
- The general idea is interesting. Prior works on fine-tuning also use techniques to reduce the divergence between the fine-tuned model and the original; this paper directly models the perturbation, which naturally limits the divergence between the tuned model and the original model and can simultaneously avoid catastrophic forgetting and enable adaptation.
- The authors adaptively adjust the prior distribution for different convolution kernels, which avoids the influence of the variance in convolutional kernel parameters of the pretrained model.
- The connection to Bayesian learning provides a theoretical basis for the proposed method.
- The paper does not introduce too many extra parameters and can be trained efficiently.
- The quantification of weight uncertainty is interesting; it shows that different layers have different uncertainty on their parameters. This observation could guide subsequent work on source-free domain adaptation.

Weaknesses:
- On line 34, it is stated that source-free domain adaptation assumes that both source data and labeled target data are not available; should this read "or" rather than "and"?
- On lines 38-40, the paper argues that the performance on the source domain will degrade. However, in many domain adaptation settings only the performance on the target domain matters; for the source domain we can directly use the model trained on the source domain to perform prediction. Thus, different from continual learning, degrading the source performance may not be a shortcoming of domain adaptation methods.
- On lines 40-42, the bias toward a specific target domain may not be a problem for adaptation to a single domain, but rather for multi-target domain adaptation. However, for multi-target domain adaptation we could also merge all the domains and perform a single target-domain adaptation.
- On lines 91-94, the authors assume by default that Δω has zero mean, which means that Δω = 0 has the highest density according to the properties of the Gaussian distribution. However, Δω = 0 indicates that the source network should be applied directly to the target network, which is a clearly suboptimal solution for Δω. Why do the authors place the point of highest density at an obviously suboptimal solution for Δω?
- The parameter sharing strategy is natural and should not be considered a significant contribution: the parameters in the same kernel have strong correlations and should not be treated separately, so sharing the same perturbation is a natural choice rather than a special strategy.
- In Section 4.3, what does "parameter non-sharing" mean in the paragraph on the parameter sharing strategy?
- The authors do not discuss the limitations of the method. Could the authors discuss this in the response, or point out where in the paper the limitations are discussed?

### Summary:
This paper proposes a novel probabilistic framework for source-free domain adaptation in which the source model serves as the invariant part (the mean), while a perturbation variance is applied to the source model parameters to derive the target model that accounts for the target-specific distribution. All four reviewers provided detailed and constructive comments, which were well taken into account by the authors in their revision and rebuttal. After discussion, all reviewers were positive about the paper. The AC agreed with the reviewers that this paper introduces a novel, solid, and parameter-simple approach to source-free domain adaptation with comprehensive empirical evaluation across several settings, which will be of wide interest to the community. A further comment of the AC is that the connection to Shai Ben-David et al.'s seminal bound is rather off-topic for this paper, making that discussion subject to flaws: there is no formal modeling of the source and target data distributions, which the bound requires, while the bound cannot describe domain relatedness in terms of model parameters. The AC therefore suggests the authors remove this part to make the paper more convincing.
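The reviews and meta-review describe the mechanism only at a high level: the frozen source weights act as a mean, a zero-mean Gaussian perturbation with a learnable scale is added to them, a KL term against a (kernel-wise adaptive) prior regularizes that scale, and one scale is shared per convolutional kernel. The PyTorch-style sketch below illustrates that idea under my own assumptions; the class name, the fixed prior scale, the per-output-kernel sharing, and the use of an ordinary reparameterized weight sample (rather than the paper's local reparameterization trick) are all simplifications, not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PerturbedConv2d(nn.Module):
    """A frozen source-model conv layer plus a learned zero-mean Gaussian
    weight perturbation; only the per-kernel log-scales are trainable."""

    def __init__(self, src_conv: nn.Conv2d, prior_std: float = 0.1):
        super().__init__()
        # source weights and bias are kept as non-trainable buffers
        self.register_buffer("w_src", src_conv.weight.detach().clone())
        self.register_buffer("b_src", None if src_conv.bias is None
                             else src_conv.bias.detach().clone())
        self.stride, self.padding = src_conv.stride, src_conv.padding
        # parameter sharing: one perturbation scale per output kernel
        self.log_sigma = nn.Parameter(torch.full((src_conv.out_channels, 1, 1, 1), -3.0))
        self.prior_std = prior_std

    def forward(self, x):
        sigma = self.log_sigma.exp()
        eps = torch.randn_like(self.w_src)
        w = self.w_src + sigma * eps   # reparameterized sample: w_t = w_s + Δw
        return F.conv2d(x, w, self.b_src, self.stride, self.padding)

    def kl(self):
        # KL( N(0, σ²) || N(0, prior_std²) ), summed over all weights of the layer
        var_ratio = (self.log_sigma.exp() / self.prior_std) ** 2
        per_weight = 0.5 * (var_ratio - var_ratio.log() - 1.0)
        return per_weight.expand_as(self.w_src).sum()
```

In such a sketch, adaptation would optimize only the log_sigma parameters with a target-domain loss plus the summed kl() terms, which is consistent with the reviewers' description of a lightweight framework with very few learnable parameters.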
[tokenized input_ids for the review/summary pair above omitted]
37851, 7792, 323, 2603, 4924, 5028, 15644, 275, 534, 253, 2603, 1566, 11029, 347, 253, 13727, 629, 1599, 1223, 247, 20452, 11041, 310, 3732, 281, 253, 2603, 1566, 3602, 281, 15313, 253, 2303, 1566, 326, 8553, 323, 253, 2303, 6160, 3268, 512, 1740, 30628, 2530, 7000, 285, 25799, 5701, 534, 497, 973, 2668, 715, 2395, 407, 253, 4477, 275, 616, 18520, 285, 30080, 22559, 846, 5955, 512, 30628, 497, 2762, 670, 253, 2929, 913, 5821, 342, 253, 30628, 326, 436, 2929, 23970, 247, 4460, 4891, 285, 3602, 45501, 2746, 281, 2603, 4924, 5028, 15644, 342, 11088, 16774, 7103, 323, 2067, 7533, 534, 588, 320, 7561, 6110, 407, 253, 3114, 247, 2007, 4385, 273, 913, 310, 326, 253, 4602, 281, 439, 2284, 23104, 580, 301, 1162, 14350, 41116, 3033, 310, 2581, 273, 649, 6361, 281, 436, 2929, 2403, 253, 5955, 2256, 281, 19652, 50276, 9088, 310, 642, 7473, 14053, 273, 253, 2603, 285, 2303, 941, 10670, 326, 310, 2424, 407, 253, 3033, 1223, 253, 3033, 2550, 6266, 5028, 2905, 1255, 275, 2426, 273, 1566, 3602, 594, 891, 1804, 253, 4477, 281, 5386, 436, 629, 281, 1056, 253, 2929, 625, 21414 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 2087, 50228, 253, 5661, 1543, 327, 2067, 5028, 15644, 49602, 762, 1264, 1027, 4758, 601, 567, 1282, 42644, 1473, 14923, 42644, 1473, 285, 45120, 42644, 1473, 12654, 253, 12510, 273, 253, 4081, 1332, 20544, 50276, 18, 347, 2080, 347, 891, 717, 6600, 436, 310, 253, 806, 673, 326, 253, 37851, 1566, 2746, 310, 5611, 275, 253, 2603, 4924, 5028, 15644, 253, 4081, 1332, 310, 22335, 3590, 285, 4460, 275, 253, 42644, 1473, 1895, 253, 30328, 3212, 253, 39762, 1566, 20452, 1332, 275, 436, 2929, 13840, 973, 253, 42644, 1473, 4758, 534, 476, 3324, 690, 4460, 16039, 281, 253, 3114, 374, 253, 4081, 1332, 310, 5919, 285, 476, 320, 5678, 342, 643, 42644, 1473, 3082, 495, 253, 4679, 3835, 1264, 1027, 42644, 1473, 15216, 285, 253, 1543, 3839, 1007, 12532, 253, 1783, 275, 2593, 7652, 310, 11088, 285, 12453, 690, 273, 619, 7350, 1309, 16725, 50276, 20881, 1255, 265, 50276, 18, 253, 16182, 2593, 19756, 690, 3309, 5740, 273, 253, 4081, 1332, 285, 4561, 5933, 534, 1543, 275, 440, 498, 15752, 285, 28931, 323, 1650, 50272, 18, 752, 310, 253, 830, 273, 253, 2173, 8103, 1159, 762, 253, 9376, 273, 2720, 285, 12637, 3268, 5393, 275, 253, 2929, 16534, 253, 2488, 2007, 19148, 253, 2048, 273, 253, 37820, 1307, 50273, 19, 310, 253, 2022, 4764, 40009, 5118, 285, 9300, 604, 667, 3587, 407, 253, 11454, 2990, 374, 690, 20121, 273, 253, 5661, 2593, 403, 21643, 50273, 18, 323, 1650, 275, 3104, 26578, 14832, 1223, 12930, 253, 28841, 42644, 1473, 253, 2929, 29820, 253, 3733, 3408, 6125, 253, 2303, 3733, 1232, 2299, 352, 310, 3542, 253, 1566, 310, 806, 10166, 970, 5096, 2603, 941, 285, 840, 5762, 970, 253, 1551, 273, 253, 2603, 941, 285, 512, 253, 2303, 941, 323, 14923, 42644, 1473, 4758, 840, 275, 3104, 22752, 18392, 352, 25957, 760, 253, 3733, 3408, 285, 253, 1071, 3408, 891, 9428, 841, 767, 25491, 403, 1097, 4879, 323, 2303, 3733, 812, 253, 4477, 23284, 907, 2603, 3733, 390, 2303, 3733, 281, 3693, 28931, 16280, 812, 253, 2603, 1566, 5978, 1232, 320, 5611, 281, 1056, 253, 5661, 5740, 625, 3426, 7296, 253, 1798, 4758, 273, 42644, 1473, 891, 1007, 3579, 281, 247, 30909, 14883, 318, 273, 253, 5661, 4278, 495, 253, 5661, 1543, 760, 3748, 253, 7200, 533, 513, 417, 2486, 253, 11041, 629, 347, 253, 4081, 1566, 20452, 1332, 760, 18325, 253, 26309, 11041, 285, 858, 417, 5731, 253, 2603, 3210, 2801, 891, 717, 14338, 670, 253, 7882, 273, 253, 4081, 1332, 577, 627, 403, 690, 12744, 15965, 12091, 323, 1650, 275, 1386, 1903, 520, 883, 1060, 359, 5467, 326, 359, 5283, 253, 1072, 2720, 26332, 268, 17118, 81, 3005, 259, 1057, 352, 2097, 326, 841, 767, 3632, 4903, 452, 253, 1072, 11041, 608, 275, 1386, 11583, 436, 2929, 3587, 19584, 253, 1182, 254, 485, 266, 29436, 305, 12064, 3268, 347, 253, 12637, 3268, 2021, 812, 253, 4477, 5513, 625, 670, 824, 10165, 285, 13260, 50272, 35501, 323, 253, 3434, 432, 253, 4477, 285, 253, 30080, 22559, 556, 973, 9713, 619, 7350, 5393, 1840, 891, 651, 751, 281, 7164, 619, 4868, 281, 818, 253, 4477, 5469, 253, 7364, 275, 2593, 374, 5474, 339, 431, 248, 5691, 273, 2603, 4924, 5028, 15644, 42644, 1473, 310, 326, 253, 2603, 941, 310, 49187, 846, 253, 2603, 1566, 310, 10166, 767, 21270, 7274, 281, 2953, 253, 1895, 310, 1442, 292, 25004, 253, 2603, 1566, 327, 2303, 5028, 285, 19071, 253, 519, 49872, 715, 253, 2603, 1566, 285, 3037, 352, 342, 2303, 941, 2299, 253, 1442, 292, 37437, 1566, 3798, 556, 4105, 26647, 273, 253, 1566, 1580, 352, 14280, 281, 320, 23539, 281, 247, 2173, 2303, 5028, 253, 23675, 2307, 8287, 1332, 310, 7591, 281, 6016, 1781, 5028, 15036, 347, 512, 253, 27311, 267, 8090, 285, 4751, 14063, 8090, 403, 
13831, 50275, 936, 33623, 841, 3237, 275, 436, 789, 253, 4477, 4081, 281, 12230, 253, 3602, 273, 253, 2603, 1566, 281, 5115, 15644, 281, 2303, 5028, 253, 1566, 20452, 2746, 3037, 10625, 29765, 3640, 949, 39762, 17699, 16561, 17032, 6941, 253, 1566, 281, 39970, 281, 4623, 10625, 436, 1566, 1530, 6932, 7792, 476, 14556, 5223, 281, 2303, 5028, 1223, 8127, 24279, 253, 20741, 800, 3745, 273, 253, 2603, 1566, 253, 1332, 556, 644, 17618, 275, 2710, 15302, 285, 3368, 7533, 20544, 50276, 783, 789, 310, 973, 24013, 8550, 253, 4028, 310, 3477, 281, 956, 285, 253, 7792, 310, 2087, 285, 12112, 50276, 33850, 247, 4460, 37851, 7792, 323, 42644, 1473, 50276, 48387, 456, 253, 10527, 4602, 875, 616, 37851, 7792, 285, 253, 17699, 16561, 11454, 2990, 50276, 783, 10527, 1783, 273, 253, 1332, 310, 2590, 285, 3590, 50275, 783, 16774, 285, 11088, 4679, 17618, 253, 1332, 275, 2710, 7533, 50276, 20881, 1255, 265, 50276, 936, 1056, 352, 1881, 1987, 404, 247, 625, 7000, 5740, 273, 253, 1566, 3733, 390, 247, 10585, 406, 853, 273, 253, 3733, 5933, 310, 3264, 323, 4227, 275, 1386, 26472, 285, 27332, 253, 4477, 760, 5393, 326, 253, 3264, 12177, 8671, 253, 1072, 11655, 908, 275, 3806, 9380, 352, 651, 320, 1805, 281, 2486, 253, 2957, 1159, 347, 24864, 50275, 17480, 253, 4477, 8671, 253, 1980, 294, 19484, 1320, 10480, 275, 1386, 28875, 352, 651, 320, 1805, 281, 13366, 5513, 849, 281, 22318, 253, 8103, 1159, 275, 16186, 20, 594, 326, 1361, 253, 2087, 8446, 2096, 2139, 285, 849, 281, 4647, 253, 10480, 275, 7819, 352, 671, 1361, 253, 8446, 755, 247, 1805, 4685, 273, 253, 3733, 24824, 50276, 74, 651, 5583, 281, 823, 247, 10585, 406, 853, 273, 253, 3733, 5933, 275, 253, 2022, 2133, 390, 24864, 824, 326, 281, 1361, 253, 10668, 281, 755, 247, 2590, 1859, 273, 253, 1566, 3733, 24824, 5474, 339, 9852, 436, 2929, 253, 4477, 12661, 281, 8415, 253, 2603, 4924, 5028, 15644, 1895, 407, 12230, 272, 253, 2603, 1566, 281, 5223, 352, 281, 253, 2303, 5028, 253, 26309, 323, 253, 2603, 1566, 403, 6311, 949, 39762, 17699, 16561, 17032, 253, 4477, 2085, 247, 4764, 9628, 5700, 281, 3012, 4796, 253, 1180, 273, 3037, 494, 26309, 3103, 253, 4081, 1332, 310, 247, 12112, 285, 28441, 7792, 285, 476, 320, 8527, 342, 5368, 3082, 9470, 4679, 327, 608, 15302, 762, 495, 1027, 42644, 1473, 7533, 7568, 253, 8936, 3045, 273, 39762, 1566, 26309, 337, 247, 12112, 285, 28441, 7792, 3477, 281, 320, 8527, 342, 2045, 3082, 374, 1077, 1643, 3037, 494, 3602, 495, 352, 34430, 1868, 253, 1232, 273, 5028, 15644, 715, 253, 13727, 629, 253, 4228, 2603, 1566, 285, 11962, 629, 253, 3037, 934, 26309, 577, 253, 4477, 4802, 39762, 1566, 20452, 342, 17699, 16561, 11454, 6928, 534, 3400, 247, 10527, 12215, 323, 253, 26647, 3045, 273, 1566, 20452, 608, 954, 5661, 4278, 2424, 323, 39306, 253, 1543, 497, 1677, 2127, 369, 9262, 347, 973, 534, 310, 35092, 721, 253, 3045, 310, 12085, 39762, 1566, 20452, 476, 5115, 253, 1682, 1543, 275, 2761, 7533, 1411, 253, 2429, 1666, 25379, 818, 253, 4679, 1379, 715, 2395, 1097, 42644, 1473, 285, 1071, 673, 15644, 7533, 534, 1646, 1077, 4209, 3877, 326, 2045, 3082, 14042, 42644, 1473, 285, 1071, 673, 15644, 10939, 24088, 5103, 18, 42644, 1473, 295, 3373, 19, 42644, 1473, 288, 498, 20, 42644, 1473, 12556, 21, 1071, 2606, 15644, 270, 18944, 22, 1071, 2606, 15644, 13450, 893, 721, 1071, 2606, 15644, 436, 2929, 9093, 440, 7790, 253, 767, 7533, 715, 581, 7792, 337, 1386, 18672, 285, 1749, 963, 993, 5223, 279, 374, 253, 5170, 1083, 285, 2406, 1083, 403, 417, 5185, 465, 392, 2373, 9515, 275, 1386, 5325, 285, 465, 392, 2373, 9515, 275, 1386, 9199, 17825, 
2720, 275, 1386, 17574, 285, 17825, 2720, 275, 1386, 30783, 495, 275, 16186, 854, 259, 4620, 323, 253, 806, 673, 352, 943, 320, 22923, 2556, 281, 16186, 374, 5474, 339, 431, 248, 2929, 29328, 247, 39762, 17699, 16561, 17032, 7792, 281, 12230, 2603, 1566, 3602, 281, 2953, 2603, 4924, 5028, 15644, 50276, 783, 2929, 29328, 7792, 10486, 5223, 84, 253, 2603, 1566, 281, 253, 2303, 5028, 285, 31221, 253, 20741, 1430, 273, 253, 1566, 285, 50275, 783, 2929, 23970, 247, 4764, 9628, 5700, 281, 5115, 625, 5919, 15644, 50276, 783, 2929, 3400, 247, 10527, 345, 3024, 279, 875, 17699, 265, 404, 4715, 285, 253, 4081, 1332, 50276, 783, 2929, 2589, 84, 4679, 327, 1264, 2603, 4924, 5028, 15644, 31225, 281, 12654, 253, 3045, 273, 253, 4081, 1332, 4757, 50276, 783, 2087, 2934, 310, 4722, 2720, 2987, 327, 1442, 292, 25004, 671, 1379, 5609, 281, 4796, 253, 23279, 875, 253, 1442, 292, 37437, 1566, 285, 253, 3236, 436, 2929, 3587, 3210, 253, 20452, 534, 10748, 3710, 253, 23279, 875, 253, 24251, 1566, 285, 253, 3236, 1566, 534, 476, 10486, 32547, 5798, 383, 41507, 37264, 285, 253, 13276, 15644, 50276, 783, 4477, 5223, 1242, 4575, 253, 2720, 3268, 323, 1027, 27311, 34501, 534, 32547, 253, 4833, 273, 253, 11041, 275, 27311, 267, 10295, 3602, 275, 253, 3215, 11273, 1566, 50276, 783, 4602, 281, 17699, 16561, 4715, 3400, 247, 10527, 3720, 323, 253, 4081, 1332, 50276, 783, 2929, 1057, 417, 9569, 1512, 1142, 4465, 3602, 285, 812, 320, 10166, 14556, 50276, 783, 21652, 273, 2801, 11649, 310, 4722, 534, 2722, 326, 1027, 8090, 452, 1027, 11649, 327, 253, 3602, 253, 8310, 812, 7102, 1563, 2987, 327, 2603, 4924, 5028, 519, 4066, 318, 50275, 20881, 1255, 50276, 251, 1386, 5910, 2603, 4924, 5028, 15644, 19584, 326, 1097, 2603, 941, 285, 13130, 2303, 941, 403, 417, 2130, 533, 417, 390, 50276, 251, 1386, 6480, 1449, 253, 2929, 1736, 316, 326, 253, 3045, 327, 253, 2603, 5028, 588, 40195, 2299, 275, 1142, 5028, 15644, 7533, 760, 253, 3045, 275, 253, 2303, 5028, 310, 1774, 323, 253, 2603, 5028, 359, 476, 3587, 897, 253, 1566, 10166, 327, 253, 2603, 5028, 281, 1347, 10554, 3021, 1027, 432, 45120, 4715, 9579, 272, 253, 2603, 3045, 778, 417, 320, 247, 2159, 4202, 273, 5028, 15644, 3082, 50276, 251, 1386, 21566, 19, 253, 8492, 281, 247, 2173, 2303, 5028, 778, 417, 320, 247, 1895, 323, 253, 5028, 15644, 281, 247, 2014, 5028, 533, 247, 1895, 323, 1554, 262, 1816, 5028, 15644, 2299, 323, 1554, 262, 1816, 5028, 15644, 359, 476, 671, 17310, 512, 253, 10625, 285, 1347, 247, 2014, 2303, 5028, 15644, 50275, 251, 1386, 898, 19332, 253, 4477, 809, 1920, 314, 5467, 326, 18687, 40639, 556, 247, 5058, 1599, 534, 2097, 326, 18687, 40639, 556, 253, 4585, 4038, 2556, 281, 253, 2867, 273, 253, 305, 12064, 3268, 2299, 18687, 40639, 50276, 17, 6492, 326, 253, 2603, 2990, 943, 320, 3587, 3732, 281, 253, 2303, 2990, 534, 310, 247, 4518, 749, 29776, 2900, 323, 18687, 40639, 2139, 1057, 253, 4477, 897, 247, 9090, 749, 29776, 2900, 387, 253, 1127, 273, 253, 4585, 4038, 323, 18687, 40639, 50276, 783, 4764, 9628, 5700, 310, 3626, 285, 943, 417, 320, 2783, 347, 247, 1534, 7680, 253, 3602, 275, 253, 1072, 10295, 556, 2266, 13007, 285, 943, 417, 320, 4127, 11794, 3021, 9628, 253, 1072, 20452, 310, 247, 3626, 4327, 533, 417, 247, 2714, 5700, 50276, 249, 2593, 7652, 752, 1057, 4764, 14122, 73, 609, 272, 1599, 275, 253, 12494, 4764, 9628, 5700, 253, 4477, 513, 417, 2319, 253, 7364, 273, 253, 1332, 812, 253, 4477, 2319, 436, 275, 253, 2380, 390, 1127, 562, 387, 534, 1659, 275, 253, 2929, 253, 4477, 2319, 436, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 4460, 
37851, 7792, 323, 2603, 4924, 5028, 15644, 275, 534, 253, 2603, 1566, 11029, 347, 253, 13727, 629, 1599, 1223, 247, 20452, 11041, 310, 3732, 281, 253, 2603, 1566, 3602, 281, 15313, 253, 2303, 1566, 326, 8553, 323, 253, 2303, 6160, 3268, 512, 1740, 30628, 2530, 7000, 285, 25799, 5701, 534, 497, 973, 2668, 715, 2395, 407, 253, 4477, 275, 616, 18520, 285, 30080, 22559, 846, 5955, 512, 30628, 497, 2762, 670, 253, 2929, 913, 5821, 342, 253, 30628, 326, 436, 2929, 23970, 247, 4460, 4891, 285, 3602, 45501, 2746, 281, 2603, 4924, 5028, 15644, 342, 11088, 16774, 7103, 323, 2067, 7533, 534, 588, 320, 7561, 6110, 407, 253, 3114, 247, 2007, 4385, 273, 913, 310, 326, 253, 4602, 281, 439, 2284, 23104, 580, 301, 1162, 14350, 41116, 3033, 310, 2581, 273, 649, 6361, 281, 436, 2929, 2403, 253, 5955, 2256, 281, 19652, 50276, 9088, 310, 642, 7473, 14053, 273, 253, 2603, 285, 2303, 941, 10670, 326, 310, 2424, 407, 253, 3033, 1223, 253, 3033, 2550, 6266, 5028, 2905, 1255, 275, 2426, 273, 1566, 3602, 594, 891, 1804, 253, 4477, 281, 5386, 436, 629, 281, 1056, 253, 2929, 625, 21414 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper proposes a general theoretical framework for inference learning to this end the authors propose the generalized inference learning gil algorithm they are then able to show that gil closely approximates the implicit stochastic gradient descent implicit sgd which corresponds to the proximal update of the weight parameters and is distinct from the explicit sgd often used by backpropagation in addition the authors compare the performance of several algorithms based on backpropagation with those based on inference learning and establish that the latter converges more quickly on smaller minibatches to the best of my knowledge this paper offers a novel perspective on inference learning by establishing the connection between gil and implicit sgd the authors do a good job of motivating the similarities between inference learning and biological learning processes in addition they provide both theoretical justifications and experimental results for their claims as for the weaknesses i believe that the authors could explain or provide some ideas for some of the experimental results they observe in more detail such as ilsgd performing worse than bpsgd but iladam and bpadam performing more similarly on cifar10 the main limitation of the paper is that it mainly explores the case where a single data point is present in each minibatch the authors bring up this limitation in their work and provide a justification as to why it is not significant from a biological point of view as such i believe the authors adequately address the limitations of their work however i am not sure whether exploring the properties of larger minibatch sizes in more detail could still yield some interesting results docsepthis paper proposes an alternative to gradient back propagation in the form of inference learning this algorithm has the advantage of being more compatible with a biophysical implementation and the paper aims to develop a theoretical framework for this learning the expected results are to obtain convergence results for this algorithm and to show the performance compared to classical bp algorithms the abstract makes a very strong assumption about the result by stating that this algorithm can be superior to the bp algorithm a main strength of the paper is to show the state of the art in machine learning algorithms and in particular to show the comparison between stochastic gradient descent type models and inference learning models the model is well presented and the notations are relatively clear in particular the main limitation of the back propagation algorithm is well highlighted by showing how to implement a local target estimation algorithm a first weakness of the paper is that it relies on details that are supplied in the 40 pages appendix and that relying on this does not allow an independent evaluation of the results in the paper secondly even if the results seem encouraging they are still clearly insufficient to show the generality of the algorithm indeed the only benchmark used is that of cifar and the results are obtained with results superior to bp only in the very restrictive case of a batch size equal to one this simulation result does not therefore justify the conclusion made in the summary about the generality of the results overall the paper suffers from a major limitation between the conclusions that are put forward in the abstract and the theoretical but also simulation 
results that are presented in the paper it would surely have been more correct to make less farreaching assumptions but fully validated by the theoretical and experimental results docsepthe paper gives a theoretical framework and justification for the biologically plausible inference learning neuralnetwork training algorithm a generalization of predictive coding the paper shows that in the lown regime il closely approximates implicit stochastic gradient descent it steps in the gradient direction while also staying close to the current weights based on this theorem the paper then improves il to more closely approximate implicit sgd which improves the stability and convergence of il across learning rates and tasks strengths strong bioplausibility motivation beats backpropagation in the smallbatch setting weaknesses beaten by backprop in the midsizebatch n64 setting i cant quite see where this paper fits in bioplausible training of artificial neural networks in the smallbatch eg n1 regime remains to my knowledge not a very common task if the idea is that the brain might do something like il can the authors point to any experiments specifically picking out il as distinct from predictive coding in the brain if the idea is to train anns for tasks why not just use bp and large batchsizes have the authors run any experiments in memoryconstrained settings showing that il does better there the authors have addressed these questions to my satisfaction in their response ### Summary:
this paper presents an interesting connection between stochastic gradient descent by backpropagation and the inference learning algorithm for predictive coding the key result is that inference learning approximates implicit gradient descent rather than explicit sgd as normally implemented the implicit methods perform comparably to standard methods and they may be of interest to computational neuroscientists interested in biologically plausible learning rules in addition to addressing the reviewers concerns i would encourage the authors to improve the exposition around eqs 1 and 2 the stated equalities require a few lines of calculus to derive and you could spare the reader the trouble
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 247, 2087, 10527, 7792, 323, 17032, 4715, 281, 436, 990, 253, 4477, 12661, 253, 14923, 17032, 4715, 305, 300, 5933, 597, 403, 840, 2104, 281, 921, 326, 305, 300, 8244, 4020, 684, 253, 15424, 19191, 11786, 18499, 15424, 256, 35333, 534, 10140, 281, 253, 19561, 5731, 273, 253, 2801, 3602, 285, 310, 5799, 432, 253, 6843, 256, 35333, 2223, 908, 407, 896, 44263, 318, 275, 1635, 253, 4477, 7277, 253, 3045, 273, 2067, 11333, 1754, 327, 896, 44263, 318, 342, 1110, 1754, 327, 17032, 4715, 285, 5100, 326, 253, 6158, 26414, 625, 4541, 327, 4577, 1054, 487, 32358, 281, 253, 1682, 273, 619, 3640, 436, 2929, 6131, 247, 4460, 8668, 327, 17032, 4715, 407, 14631, 253, 4602, 875, 305, 300, 285, 15424, 256, 35333, 253, 4477, 513, 247, 1175, 2628, 273, 15265, 839, 253, 22620, 875, 17032, 4715, 285, 7534, 4715, 4870, 275, 1635, 597, 2085, 1097, 10527, 816, 6787, 285, 5661, 1543, 323, 616, 3916, 50276, 284, 323, 253, 32213, 891, 2868, 326, 253, 4477, 812, 5513, 390, 2085, 690, 5697, 323, 690, 273, 253, 5661, 1543, 597, 10018, 275, 625, 2508, 824, 347, 38934, 35333, 9591, 7197, 685, 270, 793, 35333, 533, 4164, 43089, 285, 270, 11022, 312, 9591, 625, 12014, 327, 260, 338, 274, 740, 253, 2022, 12291, 273, 253, 2929, 310, 326, 352, 7194, 33826, 253, 1083, 835, 247, 2014, 941, 1127, 310, 1246, 275, 1016, 1054, 487, 1506, 253, 4477, 3324, 598, 436, 12291, 275, 616, 789, 285, 2085, 247, 22861, 347, 281, 2139, 352, 310, 417, 1534, 432, 247, 7534, 1127, 273, 1859, 347, 824, 891, 2868, 253, 4477, 18212, 2953, 253, 7364, 273, 616, 789, 2299, 891, 717, 417, 2119, 1880, 18216, 253, 3607, 273, 4067, 1054, 487, 1506, 9552, 275, 625, 2508, 812, 1335, 4917, 690, 4722, 1543, 5474, 33032, 2520, 2929, 29328, 271, 5795, 281, 11786, 896, 18634, 275, 253, 830, 273, 17032, 4715, 436, 5933, 556, 253, 5750, 273, 1146, 625, 13333, 342, 247, 1794, 40947, 7092, 285, 253, 2929, 13698, 281, 1287, 247, 10527, 7792, 323, 436, 4715, 253, 3264, 1543, 403, 281, 4044, 14940, 1543, 323, 436, 5933, 285, 281, 921, 253, 3045, 2429, 281, 8946, 20633, 11333, 253, 12002, 2789, 247, 1077, 2266, 9376, 670, 253, 906, 407, 14851, 326, 436, 5933, 476, 320, 8936, 281, 253, 20633, 5933, 50276, 66, 2022, 4757, 273, 253, 2929, 310, 281, 921, 253, 1375, 273, 253, 1445, 275, 5145, 4715, 11333, 285, 275, 1798, 281, 921, 253, 5301, 875, 19191, 11786, 18499, 1511, 3210, 285, 17032, 4715, 3210, 253, 1566, 310, 973, 3559, 285, 253, 41818, 403, 4942, 2590, 275, 1798, 253, 2022, 12291, 273, 253, 896, 18634, 5933, 310, 973, 16318, 407, 4645, 849, 281, 3359, 247, 1980, 2303, 13418, 5933, 50276, 66, 806, 14855, 273, 253, 2929, 310, 326, 352, 15771, 327, 4278, 326, 403, 12164, 275, 253, 3387, 7223, 30762, 285, 326, 22128, 327, 436, 1057, 417, 1581, 271, 3907, 7103, 273, 253, 1543, 275, 253, 2929, 1273, 314, 1014, 604, 253, 1543, 1646, 18462, 597, 403, 1335, 4518, 12497, 281, 921, 253, 31376, 273, 253, 5933, 6296, 253, 760, 22791, 908, 310, 326, 273, 260, 338, 274, 285, 253, 1543, 403, 2797, 342, 1543, 8936, 281, 20633, 760, 275, 253, 1077, 29190, 1083, 273, 247, 14604, 1979, 4503, 281, 581, 436, 9864, 906, 1057, 417, 3103, 15249, 253, 6452, 1160, 275, 253, 6010, 670, 253, 31376, 273, 253, 1543, 4583, 253, 2929, 27171, 432, 247, 2201, 12291, 875, 253, 11815, 326, 403, 1691, 3579, 275, 253, 12002, 285, 253, 10527, 533, 671, 9864, 1543, 326, 403, 3559, 275, 253, 2929, 352, 651, 13353, 452, 644, 625, 3451, 281, 1056, 
1679, 2080, 45017, 13260, 533, 4751, 17618, 407, 253, 10527, 285, 5661, 1543, 5474, 339, 431, 248, 2929, 4245, 247, 10527, 7792, 285, 22861, 323, 253, 35605, 21541, 17032, 4715, 11454, 18428, 3733, 5933, 247, 26647, 273, 15970, 12425, 253, 2929, 2722, 326, 275, 253, 298, 628, 9459, 4164, 8244, 4020, 684, 15424, 19191, 11786, 18499, 352, 5018, 275, 253, 11786, 3884, 1223, 671, 14596, 2810, 281, 253, 1655, 13461, 1754, 327, 436, 10012, 253, 2929, 840, 19132, 4164, 281, 625, 8244, 16851, 15424, 256, 35333, 534, 19132, 253, 7882, 285, 14940, 273, 4164, 2439, 4715, 4142, 285, 8892, 20544, 50275, 9072, 1794, 4488, 666, 2322, 16038, 50276, 1257, 1832, 896, 44263, 318, 275, 253, 1355, 23941, 4758, 50276, 20881, 1255, 265, 50276, 1257, 15030, 407, 896, 8560, 275, 253, 278, 2352, 907, 23941, 295, 1540, 4758, 891, 16216, 3240, 923, 835, 436, 2929, 13840, 275, 50276, 4193, 4488, 666, 917, 3733, 273, 13345, 11454, 6928, 275, 253, 1355, 23941, 24088, 295, 18, 9459, 4558, 281, 619, 3640, 417, 247, 1077, 1846, 4836, 50276, 338, 253, 2934, 310, 326, 253, 3998, 1537, 513, 1633, 751, 4164, 476, 253, 4477, 1127, 281, 667, 4679, 5742, 8871, 562, 4164, 347, 5799, 432, 15970, 12425, 275, 253, 3998, 50276, 338, 253, 2934, 310, 281, 6194, 271, 2224, 323, 8892, 2139, 417, 816, 897, 20633, 285, 1781, 14604, 84, 4219, 50276, 9802, 253, 4477, 1408, 667, 4679, 275, 3541, 48454, 7533, 4645, 326, 4164, 1057, 1805, 627, 50276, 783, 4477, 452, 9713, 841, 3533, 281, 619, 13212, 275, 616, 2380, 2490, 187, 4118, 18435, 27, 2520, 2929, 10262, 271, 4722, 4602, 875, 19191, 11786, 18499, 407, 896, 44263, 318, 285, 253, 17032, 4715, 5933, 323, 15970, 12425, 253, 2234, 906, 310, 326, 17032, 4715, 4020, 684, 15424, 11786, 18499, 2581, 685, 6843, 256, 35333, 347, 9403, 9009, 253, 15424, 3082, 1347, 3294, 1598, 281, 2629, 3082, 285, 597, 778, 320, 273, 1600, 281, 15180, 6551, 30202, 1346, 6110, 275, 35605, 21541, 4715, 4803, 50276, 249, 1635, 281, 15974, 253, 30628, 7350, 891, 651, 11907, 253, 4477, 281, 3157, 253, 47284, 1475, 16186, 84, 337, 285, 374, 253, 4767, 4503, 1005, 2430, 247, 1643, 3104, 273, 34171, 281, 15313, 285, 368, 812, 18345, 253, 9414, 253, 7596, 209 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 247, 2087, 10527, 7792, 323, 17032, 4715, 281, 436, 990, 253, 4477, 12661, 253, 14923, 17032, 4715, 305, 300, 5933, 597, 403, 840, 2104, 281, 921, 326, 305, 300, 8244, 4020, 684, 253, 15424, 19191, 11786, 18499, 15424, 256, 35333, 534, 10140, 281, 253, 19561, 5731, 273, 253, 2801, 3602, 285, 310, 5799, 432, 253, 6843, 256, 35333, 2223, 908, 407, 896, 44263, 318, 275, 1635, 253, 4477, 7277, 253, 3045, 273, 2067, 11333, 1754, 327, 896, 44263, 318, 342, 1110, 1754, 327, 17032, 4715, 285, 5100, 326, 253, 6158, 26414, 625, 4541, 327, 4577, 1054, 487, 32358, 281, 253, 1682, 273, 619, 3640, 436, 2929, 6131, 247, 4460, 8668, 327, 17032, 4715, 407, 14631, 253, 4602, 875, 305, 300, 285, 15424, 256, 35333, 253, 4477, 513, 247, 1175, 2628, 273, 15265, 839, 253, 22620, 875, 17032, 4715, 285, 7534, 4715, 4870, 275, 1635, 597, 2085, 1097, 10527, 816, 6787, 285, 5661, 1543, 323, 616, 3916, 50276, 284, 323, 253, 32213, 891, 2868, 326, 253, 4477, 812, 5513, 390, 2085, 690, 5697, 323, 690, 273, 253, 5661, 1543, 597, 10018, 275, 625, 2508, 824, 347, 38934, 35333, 9591, 7197, 685, 270, 793, 35333, 533, 4164, 43089, 285, 270, 11022, 312, 9591, 625, 12014, 327, 260, 338, 274, 740, 253, 2022, 12291, 273, 253, 2929, 310, 326, 352, 7194, 33826, 253, 1083, 835, 247, 2014, 941, 1127, 310, 1246, 275, 1016, 1054, 487, 1506, 253, 4477, 3324, 598, 436, 12291, 275, 616, 789, 285, 2085, 247, 22861, 347, 281, 2139, 352, 310, 417, 1534, 432, 247, 7534, 1127, 273, 1859, 347, 824, 891, 2868, 253, 4477, 18212, 2953, 253, 7364, 273, 616, 789, 2299, 891, 717, 417, 2119, 1880, 18216, 253, 3607, 273, 4067, 1054, 487, 1506, 9552, 275, 625, 2508, 812, 1335, 4917, 690, 4722, 1543, 5474, 33032, 2520, 2929, 29328, 271, 5795, 281, 11786, 896, 18634, 275, 253, 830, 273, 17032, 4715, 436, 5933, 556, 253, 5750, 273, 1146, 625, 13333, 342, 247, 1794, 40947, 7092, 285, 253, 2929, 13698, 281, 1287, 247, 10527, 7792, 323, 436, 4715, 253, 3264, 1543, 403, 281, 4044, 14940, 1543, 323, 436, 5933, 285, 281, 921, 253, 3045, 2429, 281, 8946, 20633, 11333, 253, 12002, 2789, 247, 1077, 2266, 9376, 670, 253, 906, 407, 14851, 326, 436, 5933, 476, 320, 8936, 281, 253, 20633, 5933, 50276, 66, 2022, 4757, 273, 253, 2929, 310, 281, 921, 253, 1375, 273, 253, 1445, 275, 5145, 4715, 11333, 285, 275, 1798, 281, 921, 253, 5301, 875, 19191, 11786, 18499, 1511, 3210, 285, 17032, 4715, 3210, 253, 1566, 310, 973, 3559, 285, 253, 41818, 403, 4942, 2590, 275, 1798, 253, 2022, 12291, 273, 253, 896, 18634, 5933, 310, 973, 16318, 407, 4645, 849, 281, 3359, 247, 1980, 2303, 13418, 5933, 50276, 66, 806, 14855, 273, 253, 2929, 310, 326, 352, 15771, 327, 4278, 326, 403, 12164, 275, 253, 3387, 7223, 30762, 285, 326, 22128, 327, 436, 1057, 417, 1581, 271, 3907, 7103, 273, 253, 1543, 275, 253, 2929, 1273, 314, 1014, 604, 253, 1543, 1646, 18462, 597, 403, 1335, 4518, 12497, 281, 921, 253, 31376, 273, 253, 5933, 6296, 253, 760, 22791, 908, 310, 326, 273, 260, 338, 274, 285, 253, 1543, 403, 2797, 342, 1543, 8936, 281, 20633, 760, 275, 253, 1077, 29190, 1083, 273, 247, 14604, 1979, 4503, 281, 581, 436, 9864, 906, 1057, 417, 3103, 15249, 253, 6452, 1160, 275, 253, 6010, 670, 253, 31376, 273, 253, 1543, 4583, 253, 2929, 27171, 432, 247, 2201, 12291, 875, 253, 11815, 326, 403, 1691, 3579, 275, 253, 12002, 285, 253, 10527, 533, 671, 9864, 1543, 326, 403, 3559, 275, 253, 2929, 352, 651, 13353, 452, 644, 625, 3451, 281, 1056, 
1679, 2080, 45017, 13260, 533, 4751, 17618, 407, 253, 10527, 285, 5661, 1543, 5474, 339, 431, 248, 2929, 4245, 247, 10527, 7792, 285, 22861, 323, 253, 35605, 21541, 17032, 4715, 11454, 18428, 3733, 5933, 247, 26647, 273, 15970, 12425, 253, 2929, 2722, 326, 275, 253, 298, 628, 9459, 4164, 8244, 4020, 684, 15424, 19191, 11786, 18499, 352, 5018, 275, 253, 11786, 3884, 1223, 671, 14596, 2810, 281, 253, 1655, 13461, 1754, 327, 436, 10012, 253, 2929, 840, 19132, 4164, 281, 625, 8244, 16851, 15424, 256, 35333, 534, 19132, 253, 7882, 285, 14940, 273, 4164, 2439, 4715, 4142, 285, 8892, 20544, 50275, 9072, 1794, 4488, 666, 2322, 16038, 50276, 1257, 1832, 896, 44263, 318, 275, 253, 1355, 23941, 4758, 50276, 20881, 1255, 265, 50276, 1257, 15030, 407, 896, 8560, 275, 253, 278, 2352, 907, 23941, 295, 1540, 4758, 891, 16216, 3240, 923, 835, 436, 2929, 13840, 275, 50276, 4193, 4488, 666, 917, 3733, 273, 13345, 11454, 6928, 275, 253, 1355, 23941, 24088, 295, 18, 9459, 4558, 281, 619, 3640, 417, 247, 1077, 1846, 4836, 50276, 338, 253, 2934, 310, 326, 253, 3998, 1537, 513, 1633, 751, 4164, 476, 253, 4477, 1127, 281, 667, 4679, 5742, 8871, 562, 4164, 347, 5799, 432, 15970, 12425, 275, 253, 3998, 50276, 338, 253, 2934, 310, 281, 6194, 271, 2224, 323, 8892, 2139, 417, 816, 897, 20633, 285, 1781, 14604, 84, 4219, 50276, 9802, 253, 4477, 1408, 667, 4679, 275, 3541, 48454, 7533, 4645, 326, 4164, 1057, 1805, 627, 50276, 783, 4477, 452, 9713, 841, 3533, 281, 619, 13212, 275, 616, 2380, 2490, 187, 4118, 18435, 27, 2520, 2929, 10262, 271, 4722, 4602, 875, 19191, 11786, 18499, 407, 896, 44263, 318, 285, 253, 17032, 4715, 5933, 323, 15970, 12425, 253, 2234, 906, 310, 326, 17032, 4715, 4020, 684, 15424, 11786, 18499, 2581, 685, 6843, 256, 35333, 347, 9403, 9009, 253, 15424, 3082, 1347, 3294, 1598, 281, 2629, 3082, 285, 597, 778, 320, 273, 1600, 281, 15180, 6551, 30202, 1346, 6110, 275, 35605, 21541, 4715, 4803, 50276, 249, 1635, 281, 15974, 253, 30628, 7350, 891, 651, 11907, 253, 4477, 281, 3157, 253, 47284, 1475, 16186, 84, 337, 285, 374, 253, 4767, 4503, 1005, 2430, 247, 1643, 3104, 273, 34171, 281, 15313, 285, 368, 812, 18345, 253, 9414, 253, 7596, 209 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper presents clue for crossmodality representation learning from incomplete observations using the crossencoders specifically the focus of clue is to learn comprehensive representations from datasets with each having a subset of modalities possibly unimodal strength the clue make use of the crosslinked vaes to model the crossmodality encoding it leans a comprehensive latent encoding with partially available modalities achieve better performance than the other compared methods weaknesses it need k2 crossencoders for k modalities scalability may be limited as it requires largenumber of cross models to store and compute thus poses the space and computational computational challenges yes docsepthe paper proposes clue crosslinked unified embedding to construct multimodal representations from modalityincomplete datasets and demonstrates its application on multimodal singlecell data integration the paper is well written and clearly presented the paper extends the multivae architecture following a good intuition preserving modalityspecific features rather than simply aligning multimodal data clue yields large improvement over baseline approaches the evaluation is conducted on one dataset for one specific application which seems a bit limited see questions above docsepthis paper proposes a semisupervised neural network named clue crosslinked unified embedding for crossmodality representation learning the core part is the proposed crossencoders which can learn modalityspecific representations and combine them to build an effective feature embedding this paper uses crossencoders to learn comprehensive representation however the experiments are insufficient we recommend enriching the experimental section such as experimental setup and parameter analysis of the method ### Summary:
in this paper the authors propose clue crosslinked unified embedding to construct multimodal representations from modalityincomplete datasets and apply clue to the singlecell data integration problems the proposed method is simple yet effective and shows the superior performance over stateoftheart methods all reviewers agree to accept the paper i will also vote for acceptance in the final version i encourage the authors to improve the experimental section by addressing the reviewers concerns
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 10262, 22796, 323, 2831, 2307, 1319, 6779, 4715, 432, 18464, 7313, 970, 253, 2831, 2083, 351, 398, 5742, 253, 2770, 273, 22796, 310, 281, 3037, 11088, 14237, 432, 15302, 342, 1016, 1907, 247, 8578, 273, 33433, 6830, 32505, 26306, 50275, 45563, 253, 22796, 1056, 897, 273, 253, 2831, 16862, 13460, 265, 281, 1566, 253, 2831, 2307, 1319, 9706, 352, 458, 507, 247, 11088, 21624, 9706, 342, 10571, 2130, 33433, 5115, 1805, 3045, 685, 253, 643, 2429, 3082, 50276, 20881, 1255, 265, 352, 878, 465, 19, 2831, 2083, 351, 398, 323, 465, 33433, 9171, 1430, 778, 320, 3710, 347, 352, 4419, 1236, 1541, 2764, 273, 2831, 3210, 281, 4657, 285, 11897, 3021, 24543, 253, 2317, 285, 15180, 15180, 7881, 50276, 9820, 5474, 339, 431, 248, 2929, 29328, 22796, 2831, 16862, 27998, 21496, 281, 3989, 23390, 26306, 14237, 432, 36453, 249, 11984, 15302, 285, 14371, 697, 2898, 327, 23390, 26306, 2014, 3992, 941, 9554, 253, 2929, 310, 973, 3542, 285, 4518, 3559, 50276, 783, 2929, 8725, 253, 1554, 400, 3348, 10336, 1563, 247, 1175, 30328, 24279, 36453, 6160, 3386, 2581, 685, 3365, 8495, 272, 23390, 26306, 941, 50276, 498, 489, 11026, 1781, 7756, 689, 8245, 7274, 50275, 783, 7103, 310, 5196, 327, 581, 10895, 323, 581, 2173, 2898, 534, 3133, 247, 2372, 3710, 50276, 2887, 3533, 1840, 5474, 33032, 2520, 2929, 29328, 247, 49863, 29974, 13337, 11454, 2990, 4907, 22796, 2831, 16862, 27998, 21496, 323, 2831, 2307, 1319, 6779, 4715, 253, 5161, 629, 310, 253, 4081, 2831, 2083, 351, 398, 534, 476, 3037, 36453, 6160, 14237, 285, 13398, 731, 281, 1973, 271, 3576, 4735, 21496, 436, 2929, 4648, 2831, 2083, 351, 398, 281, 3037, 11088, 6779, 2299, 253, 4679, 403, 12497, 359, 5583, 15655, 272, 253, 5661, 2593, 824, 347, 5661, 9978, 285, 4764, 1783, 273, 253, 1332, 2490, 187, 4118, 18435, 27, 249, 436, 2929, 253, 4477, 12661, 22796, 2831, 16862, 27998, 21496, 281, 3989, 23390, 26306, 14237, 432, 36453, 249, 11984, 15302, 285, 4647, 22796, 281, 253, 2014, 3992, 941, 9554, 3237, 253, 4081, 1332, 310, 2969, 2568, 3576, 285, 2722, 253, 8936, 3045, 689, 1375, 23037, 14387, 3082, 512, 30628, 5194, 281, 2997, 253, 2929, 891, 588, 671, 6273, 323, 14924, 275, 253, 2457, 2715, 891, 11907, 253, 4477, 281, 3157, 253, 5661, 2593, 407, 15974, 253, 30628, 7350 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 10262, 22796, 323, 2831, 2307, 1319, 6779, 4715, 432, 18464, 7313, 970, 253, 2831, 2083, 351, 398, 5742, 253, 2770, 273, 22796, 310, 281, 3037, 11088, 14237, 432, 15302, 342, 1016, 1907, 247, 8578, 273, 33433, 6830, 32505, 26306, 50275, 45563, 253, 22796, 1056, 897, 273, 253, 2831, 16862, 13460, 265, 281, 1566, 253, 2831, 2307, 1319, 9706, 352, 458, 507, 247, 11088, 21624, 9706, 342, 10571, 2130, 33433, 5115, 1805, 3045, 685, 253, 643, 2429, 3082, 50276, 20881, 1255, 265, 352, 878, 465, 19, 2831, 2083, 351, 398, 323, 465, 33433, 9171, 1430, 778, 320, 3710, 347, 352, 4419, 1236, 1541, 2764, 273, 2831, 3210, 281, 4657, 285, 11897, 3021, 24543, 253, 2317, 285, 15180, 15180, 7881, 50276, 9820, 5474, 339, 431, 248, 2929, 29328, 22796, 2831, 16862, 27998, 21496, 281, 3989, 23390, 26306, 14237, 432, 36453, 249, 11984, 15302, 285, 14371, 697, 2898, 327, 23390, 26306, 2014, 3992, 941, 9554, 253, 2929, 310, 973, 3542, 285, 4518, 3559, 50276, 783, 2929, 8725, 253, 1554, 400, 3348, 10336, 1563, 247, 1175, 30328, 24279, 36453, 6160, 3386, 2581, 685, 3365, 8495, 272, 23390, 26306, 941, 50276, 498, 489, 11026, 1781, 7756, 689, 8245, 7274, 50275, 783, 7103, 310, 5196, 327, 581, 10895, 323, 581, 2173, 2898, 534, 3133, 247, 2372, 3710, 50276, 2887, 3533, 1840, 5474, 33032, 2520, 2929, 29328, 247, 49863, 29974, 13337, 11454, 2990, 4907, 22796, 2831, 16862, 27998, 21496, 323, 2831, 2307, 1319, 6779, 4715, 253, 5161, 629, 310, 253, 4081, 2831, 2083, 351, 398, 534, 476, 3037, 36453, 6160, 14237, 285, 13398, 731, 281, 1973, 271, 3576, 4735, 21496, 436, 2929, 4648, 2831, 2083, 351, 398, 281, 3037, 11088, 6779, 2299, 253, 4679, 403, 12497, 359, 5583, 15655, 272, 253, 5661, 2593, 824, 347, 5661, 9978, 285, 4764, 1783, 273, 253, 1332, 2490, 187, 4118, 18435, 27, 249, 436, 2929, 253, 4477, 12661, 22796, 2831, 16862, 27998, 21496, 281, 3989, 23390, 26306, 14237, 432, 36453, 249, 11984, 15302, 285, 4647, 22796, 281, 253, 2014, 3992, 941, 9554, 3237, 253, 4081, 1332, 310, 2969, 2568, 3576, 285, 2722, 253, 8936, 3045, 689, 1375, 23037, 14387, 3082, 512, 30628, 5194, 281, 2997, 253, 2929, 891, 588, 671, 6273, 323, 14924, 275, 253, 2457, 2715, 891, 11907, 253, 4477, 281, 3157, 253, 5661, 2593, 407, 15974, 253, 30628, 7350 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the authors propose an interesting problem where only a small sized labeled data set is available for supervised learning this approach is different from the existing deep learning benchmarks which either assume the availability of a pretrained supervised imagenet classification model or an unsupervised model trained using large scale auxiliary data eg moco swav the proposed method divides training of the different stages of the convnet architecture using different sized cropspatches of images the lower layers are trained to classify the smallest size crops these layers are then frozen to train higher level layers with larger patches as inputs the network learns to classify input patches into the classes of the images they belong to evaluation is performed on an image quality assessment task on the live dataset and on indoor scene classification on the mit indoor scenes dataset the problem is of practical interest since a large amount of labeledunlabeled data is not always available for the domain of interest eg depth data etc however i am not entirely sure that methods using small scale data alone can ever achieve performances close to the methods that leverage pretrained supervisedunsupervised prior models even in cases where large scale data is unavailable small scale paired data eg image depth can be used to regularize the training of one domain using a pretrained model of the other domain eg gupta et al cvpr 2016 the experimental evaluation in this paper is woefully inadequate it is clear that the method is more successful compared to learning from scratch but i would have liked to see how much worse it is compared to finetuning of pretrained models if the gap is too large is it even a direction worth exploring is the proposed method the first attempt at training neural nets without priors if not how does it compare to others evaluation on more serious datasets such as sun 397 would have been more appealing at present it seems like a simple experiment on a couple of small datasets
docsepthe paper presents a method for improving the accuracy when training cnns from scratch on a small dataset the method is as follows instead of training the whole network on the full images one first creates a shallow model and trains it on small crops from the dataset the labels for the patches are set the same as for the whole image then one adds more layers freezes the previously trained ones and trains on bigger patches and so on the method is evaluated on 2 datasets image quality assessment on the live dataset and scene classification on mit indoor 67 while the idea might be good the paper itself is of low quality problems 1 two things are mixed together a progressive patch cropping starting from small crops and ending up with the big image and b progressively adding more and more layers while freezing the earliest ones it is not clear and justified to use both together the cropping can be seen as very aggressive data augmentation and in fact is commonly used for training imagenet cnns eg see 2 the freezing and training is a variant of net2net 1 which is not cited 2 datasets used blind image quality assessment is a bad task for testing the idea why because when an image is of bad quality it can often easily be told from a small patch that is why aggressive patch cropping while keeping the label the same is justified that is not true for other tasks like classification object detection metric learning and so on yet the method is claimed to be quite general 3 there is no baseline on any of the standard datasets nor a strong supervised baseline on the datasets the paper proposes by strong i mean one where hyperparameters are reasonably tuned and standard data augmentation is used eg for mit indoor67 the paper reports 4295 accuracy while a course report from 2015 3 has 438 accuracy i dont see improvement from the proposed method 4 the model is not even specified is it a custom cnn vggstyle resnetstyle i would recommend starting from a strong baseline and proper evaluation if the method can improve on them and is then well presented it can be published 1 net2net accelerating learning via knowledge transfer chen et al iclr 2016 httpsarxivorgabs151105641 2 httpsgithubcomnvidiadaliblob1e9196702d991d3342ad7a5a7d57c2893abad832docsexamplesusecasespytorchresnet50mainpyl116 3 httpcs231nstanfordedureports2015pdfsondiekifinalpaperpdf after rebuttal update given there is no rebuttal there will be no update
docsepthis paper presents an interesting approach for training neural networks with a small dataset the main idea is to train the model from the early layers to the deeper layers stepbystep with different types of inputs ie patches cropped images or full images sampled from the given training set experimental results show great performance compared to training from scratch pros 1 this paper presents a good idea for training with a small dataset the proposed method is technically valid and it is clear that stepwise training can help improve the performance from my point of view it is more like applying intermediate supervision on each layer using different types of training inputs by doing so we ensure that the early layers and the deeper layers are forced to learn to find the specified lowlevel and highlevel semantics respectively thus the resultant model could behave like the largescale pretrained deep cnns we analyzed in the literature 2 in the experiment section the authors present the key results on two datasets showing the advantage of the proposed method 3 the paper is easy to understand also the writing is very concise cons 1 though this paper presents a really focused contribution on training with a small dataset one can see that the paper lacks indepth analysis of either the target task or the proposed algorithm i would suggest that the authors conduct more experiments to better validate the target task ie training with a small dataset it would be great to add a transfer learning baseline ie pretrained on imagenet and then finetuned on the target dataset and show that it does not work for your research problem so the readers could better understand the difficulty of your research problem 2 my other question is more related to the problem definition or more specifically the importance of the addressed problem 1 why do we need to deal with small datasets if the target problemapplication is important it should be easy to enlarge the dataset at scale for example there is a scene classification dataset built by mit in 2009 it is a smallscale dataset which is used in this paper in 2015 due to the importance of the task mit people scaled up the dataset and the new one is called places which has 25 millions of images with scenecategory labels 2 why is it small there are two possible reasons i could think of i it may be difficult and expensive to collect training labels due to extreme labor efforts or privacy concerns ie pixelwise labels medical images ii the applications have newly emerged so all data are sparse but surely they will be scaled up in the near future but in the paper the presented tasks including scene classification and quality assessment are wellestablished and it should not be so difficult to obtain training data for them in general i think the presented experiments are toy examples and the smallscale setting may not be convincing enough i would suggest that the authors include more examples and results of smalldataset scenarios which could add value to the paper postrebuttal the idea is good but the experiments and analysis are not enough to validate the proposed idea the paper is not ready for publication ### Summary:
all reviewers agreed on the major shortcomings of this submission the most important of which is that the contributions are insufficiently evaluated there was no author response
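For concreteness, here is a minimal sketch of the stage-wise, patch-based training scheme the reviews above describe: a shallow stem is trained on small crops that inherit the label of the full image, then frozen while deeper layers are added and trained on larger crops. The crop sizes, layer widths, class count, and helper names (`make_stage`, `make_head`) are illustrative assumptions, not details taken from the reviewed paper.

```python
import torch
import torch.nn as nn
from torchvision import transforms

CROP_SIZES = [16, 32, 64]   # assumed stage-wise crop sizes; the paper's values are not given
NUM_CLASSES = 67            # e.g. mit indoor-67

def make_stage(in_ch, out_ch):
    # one convolutional block added per stage
    return nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))

def make_head(channels):
    # temporary classification head used to supervise the current stage
    return nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(channels, NUM_CLASSES))

trained, in_ch = [], 3
for stage_idx, crop_size in enumerate(CROP_SIZES):
    out_ch = 32 * (stage_idx + 1)
    stage = make_stage(in_ch, out_ch)
    model = nn.Sequential(*trained, stage, make_head(out_ch))
    for block in trained:                    # freeze everything learned in earlier stages
        for p in block.parameters():
            p.requires_grad = False
    opt = torch.optim.SGD((p for p in model.parameters() if p.requires_grad), lr=0.01, momentum=0.9)
    crop = transforms.RandomCrop(crop_size)
    # training loop (omitted): patches inherit the label of the image they were cropped from
    # for images, labels in loader:
    #     loss = nn.functional.cross_entropy(model(crop(images)), labels)
    #     opt.zero_grad(); loss.backward(); opt.step()
    trained.append(stage)
    in_ch = out_ch
```

The design question the reviewers raise is visible here: the gains could come from the aggressive crop augmentation, from the net2net-style freeze-and-grow schedule, or from both, and the reviews ask for ablations and stronger baselines to tell these apart.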
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the authors analyze the classimbalanced learning problem through the lens of the loss landscape specifically they examine the spectral density of the hessian of the classwise loss in which they observe that for the tail class loss landscape the solution converges to a saddle point they further theoretically and empirically demonstrate that sharpness aware minimization sam a recent technique that aims to converge to flat minima can be effectively used to escape saddle points for minority classes borrowing sam results in an increase in accuracy on the minority classes strength the idea to interpret the longtailed distribution problem with respect to the loss landscape is novel the authors validate the effect of sam for model training theoretically and visually weakness the proposed solution for the longtailed recognition problem is not novel enough it would be better to propose a novel method by modifying the existing sam method to be especially helpful for the tail classes simply borrowing an existing method which is generally helpful might not be enough for complete research also empirically in tables 1 2 and 3 it is hard to see that sam is especially helpful for the tail classes especially helpful to alleviate the longtailed distribution the experiments also seem to be weak for example in table 1 the authors should also evaluate their method with other imbalance ratios also they should compare with more recent stateoftheart longtailed recognition works 1 2 3 4 the baselines shown in the paper are too weak in order to claim the effectiveness of the proposed method the authors should apply the proposed method to the stateoftheart methods and see the improvements instead of only applying the method to simple baselines 1 self supervision to distillation for longtailed visual recognition iccv 2021 2 parametric contrastive learning iccv 2021 3 influencebalanced loss for imbalanced visual classification iccv 2021 4 distilling virtual examples for longtailed recognition iccv 2021 comments after the rebuttal i have read the other reviewers comments and the authors rebuttal which addresses most of my concerns therefore i would like to raise my score the authors addressed the limitations of the proposed method and the potential negative social impact of their work
docsepin the real world many datasets are imbalanced which can degrade the trained models performance to tackle this problem this paper analyzes the classimbalance problem by examining the spectral density of the hessian of the classwise loss from the observation that the tail class loss converges to a saddle point this paper proposes to use sharpness aware minimization sam this paper justifies the proposed method theoretically and empirically on multiple benchmark datasets strengths this paper justifies the proposed method with theoretical justification and various analyses the proposed method improves the baselines the paper is well written weaknesses the comparison baselines and the prior literature on longtailed learning are limited the limitations and potential negative societal impact are addressed
docsepthe authors find that for the tail class loss landscape the solution converges to a saddle point and the network thus suffers from poor generalization on minority classes this work uses sharpnessaware minimization sam to escape saddle points and enhance the generalization performance which has been theoretically and empirically demonstrated experimental results show that combining sam with sota techniques leads to significant gains in performance on minority classes strength 1 the idea of studying the effect of negative curvature for classimbalanced learning using deep neural networks is novel and interesting which may pave a new way for further research on imbalanced classification 2 experimental results empirically show that escaping saddle points with sam leads to a notable increase in overall accuracy primarily due to the major gain in the accuracy on the tail classes weakness 1 the algorithm should be presented and described in detail 2 the background of sharpnessaware minimization sam should be described in detail 1 the algorithm should be presented and described in detail which is helpful for understanding the proposed method 2 the background of sharpnessaware minimization sam should be described in detail
docsepin this paper the authors find that the key issue for imbalanced classification is that the training for minority classes can lead to convergence to saddle points of their loss landscape this phenomenon cannot be avoided in the reweighting methods consequently the network suffers from poor generalization on minority classes to escape from the saddle points they introduce sharpnessaware minimization sam with a high regularization factor rho to enhance the generalization performance they theoretically and empirically demonstrate that sam with high rho is able to escape saddle points faster than sgd and converge to better solutions pros 1 analyzing the problem of imbalanced classification from the perspective of the optimization procedure is novel 2 the conclusion that saddle points cannot be avoided in reweighting methods is interesting 3 sufficient theoretical and empirical analysis do saddle points exist in other methods such as data augmentation and samplelevel reweighting methods ### Summary:
the paper studies the problem of saddle point escape for class imbalanced datasets and mostly makes two contributions from my perspective 1 analysis of the spectral density of the hessian for classimbalanced datasets this observation is novel as far as i know 2 a short analysis of sam demonstrating it escapes saddle points while the first point seems novel and of interest i have some limited reservations regarding the second contribution theoretically the authors provide a theorem demonstrating that the cnc condition derived in daneshmand et al holds with a larger constant it is however unclear whether this is the reason for the superior performance of sam in unbalanced datasets saddle points are often not prevalent in the loss landscapes of modern neural networks the paper does not directly show that better performance is linked to saddles i would like to encourage the authors to more directly highlight the importance of the cnc condition used in the analysis overall the reviewers are still rather positive about the paper and despite its shortcomings it has the potential to encourage more research in this field i therefore recommend acceptance and invite the authors to add a discussion of the shortcomings that should be addressed in future work finally i note that there is some recent work analyzing the dynamics of gradient descent under class imbalance characterizing the effect of class imbalance on the learning dynamics francazi et al the findings do not seem to be directly related but its probably worth checking
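Since both reviews above ask for the algorithm and the background on SAM to be spelled out, here is a minimal sketch of a sharpness-aware minimization step, assuming a standard PyTorch setup; the function name `sam_step`, the closure interface, and the default `rho` are illustrative and not taken from the paper under review.

```python
import torch

def sam_step(params, loss_fn, base_opt, rho=0.05):
    # One SAM update: ascend to an approximate worst-case weight perturbation of
    # L2 radius rho, take the gradient there, then update the original weights.
    params = [p for p in params if p.requires_grad]
    base_opt.zero_grad(set_to_none=True)
    loss_fn().backward()                                   # gradient at the current weights w
    grads = [p.grad.detach().clone() for p in params]
    grad_norm = torch.sqrt(sum((g ** 2).sum() for g in grads)) + 1e-12
    eps = [rho * g / grad_norm for g in grads]
    with torch.no_grad():
        for p, e in zip(params, eps):
            p.add_(e)                                      # move to w + eps
    for p in params:
        p.grad = None
    loss_fn().backward()                                   # gradient at the perturbed weights
    with torch.no_grad():
        for p, e in zip(params, eps):
            p.sub_(e)                                      # move back to w
    base_opt.step()                                        # base update using grad(w + eps)

# usage (illustrative):
# opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
# for x, y in loader:
#     sam_step(model.parameters(), lambda: torch.nn.functional.cross_entropy(model(x), y), opt, rho=0.5)
```

The point stressed in the reviews is that a relatively large perturbation radius `rho` is what lets the tail-class iterates escape saddle regions faster than plain SGD.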
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: implicit bias of adversarial training for deep neural networks explores how minimizing the exponential loss ie $\ell(x) = e^{-x}$ of a homogeneous neural network ie a neural network such that $f = a_L w_L \circ \sigma_L \circ \cdots \circ \sigma_2 \circ a_1 w_1 = \prod_{k=1}^{L} a_k^{c} \, (w_L \circ \sigma_L \circ \cdots \circ \sigma_2 \circ w_1)$ for activation functions $\sigma_L, \ldots, \sigma_2$ weights $w_L, \ldots, w_1$ $a_L, \ldots, a_1 > 0$ and $c \geq 1$ on samples with perturbations that maximize the loss influences the optimized neural networks weights specifically this paper proves that for an exponential loss and a multichomogeneous neural network the limit point of $\frac{w}{\lVert w \rVert}$ with respect to the gradient flow $\frac{dw}{dt} = -\left( \frac{\partial \tilde{\mathcal{L}}}{\partial w} \right)^{\top}$ of the adversarial training objective $\tilde{\mathcal{L}} = \frac{1}{n} \sum_{i=1}^{n} \ell(x_i + \delta_i(w), y_i)$ under $\ell_2$-fgm fgsm $\ell_2$-pgd and $\ell_\infty$-pgd is along the karushkuhntucker kkt point of the constrained minimization problem $\min_{w_1, \ldots, w_L} \frac{1}{2} \lVert w \rVert_{\ell_2}^{2}$ st $\tilde{\gamma}_i \geq 1$ where $\tilde{\gamma}_i = y_i f(x_i + \delta_i(w))$ and $w = (w_1, \ldots, w_L)$ this theorem demonstrates that for a class of neural networks and adversarial perturbations adversarial training has an implicit bias that can be expressed in closed form this result provides an important contribution to understanding how adversarial training improves adversarial robustness implicit bias of adversarial training for deep neural networks answers a prevalent question in the theory of adversarial robustness how exactly in closed mathematical form does adversarial training improve adversarial robustness the paper proves a theorem stating that adversarial training produces an implicit bias on the normalized weights $\frac{w}{\lVert w \rVert}$ with a more precise statement in the summary of the paper the majority of the paper is clearly written and correct the papers results place a milestone in the theory of adversarial robustness however the proof of theorem 2 may have potential errors which may be from my confusion on some statements and notation adversarial training in linear neural networks is a corollary to theorem 5 so any potential errors in theorem 2 do not invalidate theorem 5s results in addition there are several statements and notation that are vague and illdefined in the theorems this makes the exact statement of the theorems difficult to ascertain potential errors in the proof of theorem 2 lemma 5 requires a constant step size 1betar barr epsilon for betar barr epsilon r3ll2left alpha beta fracepsilonbarrl left 2alpha beta fracalpha betabarrl right right the constant step size implies in lemma 5 that $\max_k \lVert w_k(t) \rVert \geq r_t$ for some $t$ however in lemma 2 the step size $\eta_t$ is not constant where $\eta_t = \min\{1, \beta(r_t, \bar r_t, \epsilon)\}$ and $r_{t+1} = r_t + \mu_t$ for $w_{t+1} \notin \mathcal{S}(r_t, \mu_t)$ as $\eta_t$ decreases whenever $w_{t+1} \notin \mathcal{S}(r_t, \mu_t)$ the step size $\eta_t$ decreases at $\mathcal{S}(r_t, \mu_t)$ and not at $\mathcal{S}(r_t)$ it is not obvious that lemma 5 holds as the step size is not constant in $\mathcal{S}(r_t)$ this is particularly important as lemma 6 and theorem 2 assume that $\max_k \lVert w_k \rVert_F \to \infty$ could you please provide a short proof that $\max_k \lVert w_k \rVert_F \to \infty$ under the assumptions of theorem 2 it is also unclear how the statements under equation 39 hold as a result of $\max_k \lVert w_k \rVert_F \to \infty$ why does $U_k \Sigma_k \Sigma_k^{\top} U_k^{\top} \to V_{k+1} \Sigma_{k+1}^{\top} \Sigma_{k+1} V_{k+1}^{\top}$ why does this imply $\Sigma_k \Sigma_k^{\top}$ and $\Sigma_{k+1}^{\top} \Sigma_{k+1}$ are approximately the same how is approximately the same defined why do all
layers have rank 1 vague language illdefined statements and undefined definitions the paper suffers from several instances of vague language for instance alignment phenomenon on pages 6 15 and 17 are not defined in the paper although i can figure out your intended definition it makes theorem 2s precise statement difficult to ascertain other instances include approximately the same on page 15 and improper use of limits on equation 52 in addition the assumptions in the theorems sometimes do not match the proofs the proof of theorem 2 assumes only a logistic loss function while theorem 2s statement assumes a broader class of loss functions limited experimental results these comments did not impact my review recommendation in many theoretical papers experimental sections typically quantitatively measure the difference between general theoretical statements and common realworld cases for example experiments show how theorems in compressed sensing deviated from typical use cases in the paper a particular area for improvement is the experimental section this section show the training accuracy and the normalized margin both plots will clearly increase as a result of adversarial training and do not add any useful information to the paper do the weights singular values diverge to infinity in practice does theorem 5s statement on the normalized weights still occur when you use other losses and neural networks or remove assumptions 2 and 4 implicit bias of adversarial training for deep neural networks contributes a milestone result to the theory of adversarial robustness the majority of the paper is clearly written and correct however the adversarial training for linear neural networks portion of the paper has several flaws potential errors vague language and illdefined statements such as approximately the same and improper use of limits in equation 52 and use of undefined notation and definitions such as alignment phenomenon nevertheless the contributions are significant and novel and the paper would receive an accept if adversarial training for linear neural networks is amended or removed docsepthis paper characterizes the bias of adversarial training toward specific minimumnorm solutions or kkt points of a particular optimization problem their results generalizes the work of li et al 2020 by proving the directional alignment with the adversarial maxmargin solution for deep linear models for l2 perturbations theorem 2 as well as convergence in direction for homogenous networks for l2 fgm fgsm and l2 linf pgd perturbations theorem 5 strengths the results are novel and extend prior theoretical results to the extent i have verified the proofs are correct minor comments theorem 5 one limitation of this result is that it depends on the adversarial perturbation as part of the constraints in eq 19 that is in comparison with related results of li et al 2020 and faghri et al 2021 that their characterizations make the difference between the solution for various lpnorm perturbations clear understandably theorem 5 is a more general result for homogenous models but it would still be useful to derive prior linear results as corollaries of theorem 5 assumption 4 can easily be false for large perturbation sizes the footnote says similar assumptions have been made in prior work but those were not about separability of adversarial examples can you provide more justification for this assumption figure 1 this is an interesting plot confirming the increase in adversarial margin can you plot fgsm and pgd on the same plot 
i understand that the adversarial margin for the two is different because the corresponding optimization problem is different however a natural question is is there a relation between the two problems page 9 tradeoff between standard and adversarial accuracy im not sure i understand the theoretical argument of this part is there a concrete result based on theorem 5 section 4 have you verified these results for varying epsilon size other than 16/255 how about other network architectures is there a challenge in doing so this paper makes a solid theoretical contribution it could be improved with more empirical verification of the results docsepthe paper studies the adversarial training problem under deep linear network classifiers and standard l2 and linfty perturbations the papers main result suggests that in the linearly separable case the adversarially trained model via gradient descent will asymptotically converge to the maxmargin solution some extensions of this result to homogeneous neural networks with exponential loss function have been provided the paper also performs some preliminary numerical experiments to support the theoretical results this paper focuses on the convergence behavior of adversarial training methods the paper tries to show the implicit bias of adversarial training in a simplified setting with a deep linear neural network and linearly separable data for a binary classification problem under this setting the paper proves the gradient descent algorithm will converge asymptotically to the maxmargin solution later the paper extends this result to homogeneous neural network functions with the exponential loss function and a separation assumption in assumption 4 overall the paper targets an interesting question and proves some useful results on the behavior of adversarial training problems however i have some comments on the assumptions made to simplify the analysis some of which seem to be quite restrictive and limit the results application to real adversarial training problems regarding the papers theoretical setting i think some of the assumptions are quite strong in section 3.1 the analysis is limited to deep linear networks and linearly separable data in binary classification also the convergence result is an asymptotic guarantee which does not bound the iteration complexity of finding the maxmargin solution the convergence guarantee to the maxmargin solution also holds for the standard training algorithm which questions whether the result can distinguish adversarial training methods from standard training algorithms while the results in section 3.1 are written clearly i think section 3.2 lacks a clear presentation and makes several limiting assumptions first the gradient steps are replaced with gradient flow which is inconsistent with real adversarial training experiments also the exponential loss function used for theoretical analysis is not used in practical adversarial training experiments it is not clear whether the results can be extended to standard crossentropy and squarederror loss functions in deep learning classification problems also assumption 4 on the separability of adversarial examples is pretty strong and essentially assumes the adversarial training method finds a perfect solution at some iteration t0 which is too strong given that the paper wants to study the convergence behavior of adversarial training therefore i think the assumptions are too restrictive for a real adversarial learning setting and significantly limit the application of the results to
practical deep learning experiments i will look forward to the authors responses regarding the reasoning behind these assumptions to give my final score while the paper shows some insightful results on the convergence of adversarial training for deep linear networks the assumptions for the analysis of deep nonlinear networks seem too restrictive to me also replacing the gradient steps with the gradient flow seems incompatible with standard adversarial training experiments the paper would become much stronger after relaxing some of the assumptions and performing the analysis for the actual gradient descent algorithm rather than considering the gradient flow docsepthis paper aims to understand the training results of adversarial training and proves that under certain conditions adversarial training results maximize the margin for the adversarial training samples similar results have been observed in the clean training of dnn this papers contribution is to extend them to the adversarial training settings the paper seems technically sound although i dont have the time to go over all the appendix the results to be honest are not surprising given previous works on standard training but i believe the rigorous justification presented in the paper is of importance some minor concerns 1 the notation of loss function is abused the loss function in 6 takes both x and y as the input while in assumption 1 the loss function only takes one argument i suppose the loss function in 6 means l(x, y, w) = l(f(x, w) y) where l is the loss function in assumption 1 2 i dont quite understand the remark at the top of page 9 the lhs of 20 is about the prediction of the clean test sample the rhs of 20 is about the fitting of the adversarial training sample how does this inequality relate to the tradeoff between robustness and accuracy please add more discussion 3 page 21 after eq 81 and lemma 4 can be applied to it should be and lemma 9 can be applied to the theoretical contribution of this paper is good to me unless other reviewers find technical errors in the proof i think it is a good supplement to the current theory of adversarial dnn training ### Summary:
the paper is a nice addition to the developing theory of implicit bias in neural training while the results are somewhat expected the technical aspects are fairly involved due to the adversarial component
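the reviews above refer repeatedly to fgm/fgsm and pgd perturbations and to tracking an adversarial margin; the snippet below is a minimal, self-contained numpy sketch of those two attack steps for a fixed linear logistic model, assuming made-up data, a budget eps = 0.1, and a pgd step size of 0.02 (all illustrative choices, not the experimental setup of the paper under review).

```python
import numpy as np

rng = np.random.default_rng(0)

# toy data and a fixed linear classifier (illustrative only)
n, d, eps = 200, 5, 0.1
w = rng.normal(size=d)
x = rng.normal(size=(n, d))
y = np.sign(x @ w + 1e-12)          # labels consistent with w, so the clean margin is positive

def grad_x(w, x, y):
    # gradient of the logistic loss log(1 + exp(-y * w.x)) with respect to the input x
    z = y * (x @ w)
    return (-y / (1.0 + np.exp(z)))[:, None] * w[None, :]

def fgsm(w, x, y, eps):
    # single signed-gradient step of size eps (l_inf fast gradient sign method)
    return x + eps * np.sign(grad_x(w, x, y))

def pgd_linf(w, x, y, eps, steps=10, alpha=0.02):
    # iterated signed-gradient steps projected back onto the l_inf ball of radius eps
    x_adv = x.copy()
    for _ in range(steps):
        x_adv = x_adv + alpha * np.sign(grad_x(w, x_adv, y))
        x_adv = np.clip(x_adv, x - eps, x + eps)
    return x_adv

def adv_margin(w, x_adv, y):
    # smallest signed score on the perturbed points, i.e. the quantity tracked in margin plots
    return float(np.min(y * (x_adv @ w)))

print("fgsm adversarial margin:", adv_margin(w, fgsm(w, x, y, eps), y))
print("pgd adversarial margin:", adv_margin(w, pgd_linf(w, x, y, eps), y))
```

for a linear model the two attacks end up at essentially the same corner of the l_inf ball, so the printed margins should nearly coincide; the interesting differences discussed in the reviews arise for deep networks.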
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: i the literature reports a large body of work that focuses on minimizing the tradeoff between accuracy and fairness and between accuracy and privacy there is relatively little work that focuses on the intersection between privacy fairness and accuracy ii for an easier interpretation the authors take care to show a simplified version of theorem 1 about the conflicting effects of accuracy and privacy on the one hand and fairness on the other hand and theorem 3 questioning the sacrifice of accuracy for fairness besides the detailed versions developed in appendix similarly a proof sketch of theorem 2 is provided in the body of the text iii the experimental protocol developed on synthetic data and on celeba and cifar10 visual datasets is convincing i do not see any minor comments page 4 in the previous section we considered assumptions that allow the algorithm to generate classifiers which makes mistakes on frequently occurring examples but with a low probability and makes mistakes on infrequent samples with a high probability page 5 theorem 2 and its proof is included in appendix a2 page 16 i do not see the necessity of the second line in 24 docsepimportant topic and interesting research angle theoretical results corroborated by experimental results while admitting the importance of this research direction claimed contributions have been unclear to me related to the stated contribution 1 and 2 the authors discussed the advantages of approximate differential privacy over differential privacy how about advantages when understanding interactions between fairness and privacy based on approximate differential privacy and on differential privacy are these advantages mainly due to the use of approximate differential privacy or the authors contributions if any the focus data is the longtailed although synthetic data is longtailed generated i do not see that the real world data shares such a property in addition any reason for exclusively using computer vision datasets many tabular datasets actually share the longtailed characteristics which might resonate with the longtailed focus of this paper for concrete examples check 1 a survey on datasets for fairnessaware machine learning dami related work on state of the art regarding the tradeoff between accuracy and fairness in addition non deep learning approaches such as random forests are widely used in practice which is one motivation of using the employed framework in comparison to deep learning methods are ignored relevant discussion and comparison can strengthen the merit of this work docsep1 the research topic is interesting understanding the tradeoff between privacy fairness and accuracy is essential for the development of trustworthy algorithms 2 the major conclusions from theoretical findings are clear 3 supportive experiments look sound and serve as a good explanation for the theorems the figures fully show the trend along the three directions 1 the presentation for this paper can be further improved some details are hard to follow with many notations defined one or two pages ago notation abuse or the change of variables and variables without intuitive explanation it is not easy to understand a theoretical result in a short time the organization also can be reconsidered for example i dont quite follow lemma 1 and 2 under the content privacy at the cost of fairness pointing out their relations to the current topics would be helpful minor the
authors are encouraged to simplify some notations probably a table for notations is helpful align the notations with fig 1 left or more figures also help the reading docsep1 the authors look at the problems of privacy fairness and accuracy from a new perspective that can potentially lead to promising future research discussions in the community 2 the revealed findings are interesting and provide new insights that may have some impact 3 findings are supported with strong theoretical proofs and empirical results 4 writings are clear and easy to follow i think to get more conclusive findings the authors need to experiment with other popular fairness notions and consider other types of data such as text some important findings in the paper are based on the results of one single dataset also i am a bit lost trying to connect the longtailed data with minority groups do they represent the same things in this work some explanations in the introduction can help do the authors plan to release their code and data see q4 ### Summary:
meta review the paper studies the intersection between fairness privacy and accuracy reviewers are overall positive about the novel insights that the paper provides minor concerns are well covered by the rebuttal
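the reviews above ask about other popular fairness notions and stress long-tailed group sizes; the snippet below is a small illustrative sketch that computes two common group-fairness gaps (demographic parity and equal opportunity) over a deliberately long-tailed group distribution; the group sizes, labels, and predictions are random placeholders and not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# deliberately long-tailed group sizes (an assumption for illustration)
sizes = [1000, 200, 40, 8]
group = np.concatenate([np.full(s, g) for g, s in enumerate(sizes)])
y_true = rng.integers(0, 2, size=group.size)
y_pred = rng.integers(0, 2, size=group.size)

def demographic_parity_gap(y_pred, group):
    # largest difference in positive-prediction rate across groups
    rates = [y_pred[group == g].mean() for g in np.unique(group)]
    return max(rates) - min(rates)

def equal_opportunity_gap(y_true, y_pred, group):
    # largest difference in true-positive rate across groups
    tprs = []
    for g in np.unique(group):
        m = (group == g) & (y_true == 1)
        if m.any():
            tprs.append(y_pred[m].mean())
    return max(tprs) - min(tprs)

print("demographic parity gap:", demographic_parity_gap(y_pred, group))
print("equal opportunity gap:", equal_opportunity_gap(y_true, y_pred, group))
```

with a tail group of only a handful of samples these gap estimates become very noisy, which is one concrete way the long-tailed setting interacts with fairness evaluation.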
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes bootstrapped transformer which uses bootstrapping to selfgenerate more offline data to improve sequence model training of trajectory transformer in the offline rl setting the authors study autoregressive generation and teacher forcing for trajectory generation in bootstrapping along with data filtering based on confidence score prediction probability and simple curriculum learning they demonstrate better performance compared to other offline rl baselines on the d4rl dataset specifically on gymmujoco and adroit they also conduct analysis on bootstrapped data from the perspective of coverage and diversity and alignment with the mdp and an ablation to provide a simple guideline for bootstrapping strengths the paper is wellwritten and easy to follow the proposed method is wellmotivated and the problem being attacked is important in using sequence modeling in offline rl the main results are in general quite good and support the major contribution of this work weakness ttreproduce is somewhat very different from ttoriginal actually its worse in general thus i wonder if its possible that ttoriginal with retraining can achieve better results why s4rl used in the experiment only considers gaussian noise but not also other augmentation techniques proposed in the original paper like dropout mixup etc regarding the ttretrain baseline in addition to uniformly sampling from the original dataset for additional training what about a simple weighted sampling based on loss or data likelihood the author claims in lines 184-190 that autoregressive generation is better than teacher forcing due to its more diverse generation in spite of the seemingly persuasive explanation the marginal difference in table 2 plus no boot ar in the adroit experiment table 3 though the author mentions it is too timeconsuming i wonder if there is an alternative to provide more concrete justification do not fully justify the statement from figure 2 ar and tf seem to be complementary to each other with tf covering sparser regions in the original data distribution i wonder whether combining these two bootstrapping techniques with simple methods like randomly choosing one approach to generate data will provide further improvement how does the proposed method perform under nonexpert demonstration specifically bootstrapping largely depends on the correctness of the original model ie tt here i wonder if we gradually mix lessperforming data into the original training set eg replacing some portion of the dataset with the random dataset in d4rl how will this negatively affect bootstrapping while autoregressive generation can produce larger variation how does the accumulated error affect bootstrapping autoregressive generation with larger generation length t produces data with larger deviation from the original dataset but at the same time suffers from larger accumulated error thus t may be an interesting hyperparameter for an ablation study to see how to strike a good balance between the tradeoff apart from the large training time due to online pseudo training data generation mentioned by the authors it would be nice to see more details or even failure modes resulting from instability of bootstrapping the authors mentioned some of them in the ablation as well as techniques to remedy such issues but if any more it would be very helpful to see all other failure strategies that the authors have tried docsepin order to tackle the problem that offline rl data is often small in size
and lacks in coverage this paper proposes to treat rl as a sequence modeling task and augment the training data by selecting the high confidence generated samples from the sequence model that is going through training the proposed method is evaluated on a subset of the d4rl continuous control tasks and is shown to outperform several baselines including transformerbased and modelbased ones as well as cql and bc for a bit over half of the tasks strengths the algorithms main body is simple and orthogonal to other offlinerl works the only constraint is to use a sequence model interesting analysis of the data distribution generated by teacherforcing vs autoregressive clear description of the method weakness it is unclear whether the gain of boot comes from 1 extra data 2 different architecture pretrained gpt2 vs not 3 some inherent property in the sequence model as opposed to other world models that may only predict the observation and the reward it is unclear from the paper whether bootstrapping is novel beyond supervised learning eg in rl there are quite a few additional limitations not mentioned in the paper l349-350 1 the extra two hyperparameters introduced k and require finetuning which depends on availability to the environment or a good ope method 2 as mentioned in l37-39 for other tasks in general it is unclear whether the dataset available is sufficient to train a boot unless we try it which will incur extra training time and cost as mentioned in l349-350 docsepthis paper extends recent sequencemodeling approaches to rl with modelbased data augmentation with the end result looking like a dynainspired variant (sutton 1991, https://dl.acm.org/doi/10.1145/122344.122377) of the trajectory transformer a few design decisions such as the sampling strategy autoregressive versus teacherforced and how often to reuse the modelgenerated data are investigated the algorithm is evaluated on the locomotion and adroit domains of the d4rl offline rl benchmark this paper considers a relatively straightforward combination of existing algorithms but does so thoroughly it is careful in its empirical evaluations with baseline variants including a tt with extra training to isolate the effects of boots extra gradient steps and b an s4rlmodified version of tt to compare the modelgenerated data to other forms of data augmentation the paper is also careful not to overclaim its novelty and combined with its thoroughness certainly executes on the premise it describes in the abstract and introduction the section on further analysis of bootstrapping hints at a question i have myself wondered about when a model is trained on the same underlying data as a policy but can then generate more data that further improves the policy what about the modelgenerated data caused that improvement the answer is somewhat heuristic and qualitative here but this question has underrecognized subtlety so i appreciated an attempt at addressing it empirically here i do think that the introduction might have overpromised a bit here saying that the generated pseudo trajectories keep consistent with the underlying mdp most would interpret that as meaning consistent with the initial state distribution and transition function but the investigation the authors conducted doesnt actually take into account dynamical feasibility but inter-state distances that being said this is mostly a quibble about terminology and any attempt at answering this question at all is a good start as discussed in the strengths and weaknesses section the main weakness is that boot is a
somewhat straightforward extension of existing work in modelbased rl however the paper does a thorough investigation of this extension and does not overclaim its contribution ### Summary:
summary offline rl rl algorithms aim to learn policies without interacting with an environment purely from the state and actions covered in the offline datasets however in realworld datasets the coverage can be insufficient to learn good policies thus it is an important research direction to improve the sample efficiency of those methods this paper approaches offline rl from a sequence modeling perspective the paper adopts a variant of trajectory transformers for data generation and they investigate two of the main design decisions in those models sampling methods autoregressive vs teacherforcing based reuse of modelgenerated data the authors validate their idea on two d4rl tasks adroit locomotion decision overall the paper is wellwritten and easy to understand the results and experiments are thorough and the paper goes for more depth in the experiments rather than breadth the ideas in the paper are not novel but the paper does not overclaim its contributions and results are interesting as a result i think both neurips and the broader offline rl communities would benefit from the findings of this paper i am nominating this paper for acceptance the reviewers were very positive about this paper during the rebuttal and discussion paper they all agreed that the paper is valuable and interesting contribution to the community the main criticism of this paper that came up during the discussion period was that the idea is just a straightforward combination of the existing techniques however the idea presented in the paper is coherent reasonable and executed well the authors provided a very detailed rebuttal with clarifications to the points that reviewers raised as a result of the rebuttal some of the reviewers increased their scores i would recommend that the authors incorporate some of those clarifications into the cameraready paper version some of those are in response to reviewer 9mfts question on the results with cql on additional data generated by boot is very interesting i think the authors should include it in the cameraready version of the paper additional experiments on other datasets from the d4rl gym environment as asked by reviewer 9mft the experiments asked by the reviewer hqzj
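The bootstrapping recipe discussed in the reviews above (train a trajectory sequence model, self-generate candidate trajectories either autoregressively or with teacher forcing, keep only the generations the model itself scores as high confidence, and mix them back into the training pool) can be summarized in a rough sketch. The code below is only an illustration of that loop, not the paper's implementation; every name in it (train_step, generate_autoregressive, sequence_log_prob, add_generated) is a hypothetical placeholder for whatever the real training stack provides.

```python
def bootstrap_training(model, dataset, rounds=10, num_generate=64,
                       horizon=25, keep_quantile=0.9):
    """Rough sketch of confidence-filtered bootstrapping for a trajectory
    sequence model. 'model' and 'dataset' are duck-typed placeholders; none
    of these method names come from the paper itself."""
    for _ in range(rounds):
        # standard supervised training on the current pool (real + generated)
        for batch in dataset.sample_batches():
            model.train_step(batch)

        # self-generate candidate trajectories from real prefixes
        prefixes = dataset.sample_real_prefixes(num_generate)
        candidates = [model.generate_autoregressive(p, horizon) for p in prefixes]

        # use the model's own sequence log-probability as the confidence score
        scores = [model.sequence_log_prob(t) for t in candidates]
        cutoff = sorted(scores)[int(keep_quantile * (len(scores) - 1))]
        accepted = [t for t, s in zip(candidates, scores) if s >= cutoff]

        # mix the accepted pseudo-trajectories back into the training pool
        dataset.add_generated(accepted)
    return model
```

Teacher-forced generation would replace the generate_autoregressive call with one that conditions on ground-truth tokens at each step, which, per the reviews, tends to stay closer to the original data distribution than autoregressive rollouts.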
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 7491, 10981, 1882, 39707, 534, 4648, 7491, 10981, 2784, 281, 1881, 16450, 625, 28841, 941, 281, 3157, 3425, 1566, 3733, 273, 18974, 39707, 275, 28841, 391, 77, 4758, 253, 4477, 1263, 1125, 2646, 254, 800, 5978, 285, 9732, 17190, 323, 18974, 5978, 275, 7491, 10981, 2784, 2112, 342, 941, 19690, 1754, 327, 7162, 4868, 10554, 5912, 285, 2969, 24642, 4715, 597, 7568, 1805, 3045, 10941, 281, 643, 28841, 391, 77, 1666, 25379, 275, 277, 21, 8435, 10895, 5742, 327, 17409, 1906, 75, 16856, 285, 519, 14790, 597, 671, 2589, 1783, 327, 7491, 10981, 1882, 941, 432, 253, 8668, 273, 3835, 2961, 2095, 285, 12420, 342, 278, 12132, 285, 271, 28913, 281, 2085, 2969, 29609, 273, 7491, 10981, 2784, 20544, 50276, 783, 2929, 310, 973, 15720, 285, 3477, 281, 956, 50276, 783, 4081, 1332, 310, 973, 24013, 8550, 285, 253, 1895, 1146, 13964, 310, 1774, 275, 970, 3425, 14053, 275, 28841, 391, 77, 50276, 783, 2022, 1543, 403, 275, 2087, 3240, 1175, 285, 1329, 253, 2201, 7680, 273, 436, 789, 50276, 20881, 1255, 50276, 1440, 250, 5551, 336, 310, 8489, 1077, 1027, 432, 246, 13473, 10019, 2686, 697, 7197, 275, 2087, 3021, 891, 4282, 604, 697, 1896, 326, 246, 13473, 10019, 342, 851, 26208, 476, 5115, 1805, 1543, 50276, 22309, 256, 21, 8435, 908, 275, 253, 3368, 760, 19401, 305, 12064, 6046, 533, 417, 671, 643, 42072, 5609, 4081, 275, 253, 3236, 2929, 751, 5926, 483, 5878, 484, 3966, 50276, 1747, 13218, 42085, 1221, 1949, 8245, 3081, 281, 17568, 3410, 432, 253, 3236, 10895, 323, 3081, 3733, 752, 670, 247, 2969, 17375, 10491, 1754, 327, 2957, 390, 941, 12177, 50276, 783, 2488, 3916, 275, 1386, 25921, 16129, 326, 47694, 11020, 5978, 310, 1805, 685, 9732, 17190, 1955, 281, 697, 625, 11117, 5978, 275, 15866, 273, 253, 16907, 34593, 8813, 253, 16888, 3064, 275, 2829, 374, 5043, 642, 7491, 549, 275, 519, 14790, 3368, 2829, 495, 2167, 253, 2488, 25957, 352, 310, 1512, 673, 33136, 891, 4282, 604, 627, 310, 271, 5795, 281, 2085, 625, 11859, 22861, 13414, 973, 15249, 253, 3908, 50276, 4064, 4677, 374, 549, 285, 28793, 1646, 281, 320, 19767, 281, 1016, 643, 342, 28793, 10949, 653, 9332, 4811, 275, 253, 3236, 941, 3268, 891, 4282, 16248, 841, 767, 7491, 10981, 2784, 5609, 342, 2969, 3082, 751, 12421, 13887, 581, 2746, 281, 6635, 941, 588, 2085, 2007, 7756, 50276, 5430, 1057, 253, 4081, 1332, 1347, 762, 44382, 8292, 20028, 5742, 7491, 10981, 2784, 8127, 7024, 327, 253, 36594, 273, 253, 3236, 1566, 26332, 42085, 1060, 891, 4282, 604, 359, 13237, 5878, 1679, 468, 14692, 941, 281, 253, 3236, 3733, 873, 24088, 15706, 690, 5110, 273, 10895, 342, 3632, 10895, 275, 277, 21, 8435, 849, 588, 436, 18123, 2818, 7491, 10981, 2784, 50276, 6050, 47694, 11020, 5978, 476, 4711, 4067, 7629, 849, 1057, 253, 20821, 2228, 2818, 7491, 10981, 2784, 47694, 11020, 5978, 342, 4067, 5978, 2978, 246, 11330, 941, 342, 4067, 11254, 432, 253, 3236, 10895, 533, 387, 253, 1072, 673, 27171, 432, 4067, 20821, 2228, 3021, 246, 778, 320, 271, 4722, 4373, 19484, 323, 28913, 1263, 281, 923, 849, 281, 9974, 247, 1175, 6654, 875, 253, 5454, 2727, 7419, 432, 1781, 3733, 673, 1955, 281, 3909, 17927, 3733, 941, 5978, 5393, 407, 253, 4477, 352, 651, 320, 5322, 281, 923, 625, 4278, 390, 1014, 4433, 4438, 4795, 432, 17620, 273, 7491, 10981, 2784, 253, 4477, 5393, 690, 273, 731, 275, 28913, 347, 973, 347, 5609, 281, 16748, 824, 2523, 533, 604, 667, 625, 352, 651, 320, 1077, 9371, 281, 923, 512, 643, 4433, 8130, 326, 253, 
4477, 452, 3597, 5474, 339, 9852, 1340, 281, 18915, 253, 1895, 326, 28841, 391, 77, 941, 310, 2223, 1355, 275, 1979, 285, 19756, 275, 7031, 436, 2929, 29328, 281, 1555, 391, 77, 347, 247, 3425, 14053, 4836, 285, 35919, 253, 3733, 941, 407, 17221, 253, 1029, 7162, 4561, 3530, 432, 253, 3425, 1566, 326, 310, 1469, 949, 3733, 50275, 783, 4081, 1332, 310, 6760, 327, 247, 8578, 273, 253, 277, 21, 8435, 5415, 1453, 8892, 285, 310, 2011, 281, 562, 32231, 2067, 1666, 25379, 1690, 39707, 3169, 285, 1566, 3169, 4394, 347, 973, 347, 260, 5848, 285, 49501, 323, 247, 2372, 689, 2716, 273, 253, 8892, 50275, 296, 3755, 20556, 50276, 783, 11333, 2022, 2133, 310, 2969, 285, 19627, 281, 643, 745, 41904, 77, 2987, 253, 760, 7658, 310, 281, 897, 247, 3425, 1566, 50275, 47606, 1783, 273, 253, 941, 3268, 4561, 407, 9732, 22958, 4632, 47694, 11020, 50276, 8250, 5740, 273, 253, 1332, 50276, 20881, 1255, 50276, 262, 310, 12744, 1880, 253, 6351, 273, 7491, 3249, 432, 337, 4465, 941, 374, 1027, 10336, 3215, 11273, 305, 431, 19, 4632, 417, 495, 690, 12794, 2867, 275, 253, 3425, 1566, 347, 10066, 281, 643, 1533, 3210, 326, 778, 760, 3283, 253, 8310, 285, 253, 10921, 50276, 262, 310, 12744, 432, 253, 2929, 1880, 7491, 10981, 2784, 310, 4460, 4457, 22296, 4715, 24088, 275, 391, 77, 627, 403, 3240, 247, 1643, 3081, 7364, 417, 5393, 275, 253, 2929, 298, 1706, 4590, 1235, 337, 253, 4465, 767, 4373, 22041, 5611, 465, 285, 50276, 15684, 1442, 292, 25004, 534, 7024, 327, 11659, 281, 253, 3126, 390, 247, 1175, 258, 365, 1332, 374, 347, 5393, 275, 298, 1787, 1867, 323, 643, 8892, 275, 2087, 352, 310, 12744, 1880, 253, 10895, 2130, 310, 4209, 281, 6194, 247, 7491, 5734, 359, 1611, 352, 534, 588, 36967, 4465, 3733, 673, 285, 2105, 347, 5393, 275, 298, 1706, 4590, 1235, 5474, 33032, 2520, 2929, 8725, 3332, 3425, 7645, 272, 7274, 281, 391, 77, 342, 1566, 3169, 941, 42072, 342, 253, 990, 906, 2819, 751, 247, 24187, 404, 1033, 1250, 256, 28738, 10226, 3614, 11830, 50232, 2061, 14369, 6903, 11838, 805, 1508, 2031, 805, 1508, 2357, 12955, 273, 253, 18974, 39707, 247, 1643, 2216, 7089, 824, 347, 253, 10491, 5700, 47694, 11020, 7147, 9732, 15189, 285, 849, 2223, 281, 33150, 253, 1566, 20419, 941, 403, 6949, 253, 5933, 310, 6760, 327, 253, 23904, 5011, 285, 519, 14790, 10625, 273, 253, 277, 21, 8435, 28841, 391, 77, 22791, 436, 2929, 19401, 247, 4942, 15246, 5019, 273, 5368, 11333, 533, 1057, 594, 16575, 352, 310, 10182, 275, 697, 16774, 27163, 342, 8245, 11640, 1690, 247, 42085, 342, 4465, 3733, 281, 20843, 253, 2538, 273, 19269, 4465, 11786, 5018, 285, 270, 271, 256, 21, 8435, 25016, 2715, 273, 42085, 281, 7277, 253, 1566, 20419, 941, 281, 643, 4948, 273, 941, 42072, 253, 2929, 310, 671, 10182, 417, 281, 689, 7041, 697, 38135, 285, 5678, 342, 697, 11080, 1255, 5604, 42506, 327, 253, 26536, 352, 8631, 275, 253, 12002, 285, 10199, 50276, 783, 2593, 327, 2007, 1783, 273, 7491, 10981, 2784, 28145, 387, 247, 1953, 891, 452, 4266, 13876, 670, 672, 247, 1566, 310, 10166, 327, 253, 1072, 6944, 941, 347, 247, 3646, 533, 476, 840, 6635, 625, 941, 326, 2007, 19132, 253, 3646, 752, 670, 253, 1566, 20419, 941, 4269, 326, 7756, 253, 3662, 310, 8489, 47641, 285, 18276, 1060, 533, 436, 1953, 556, 762, 35477, 16105, 555, 594, 891, 14109, 271, 3177, 387, 15974, 352, 45190, 1060, 891, 513, 1158, 326, 253, 10199, 1537, 452, 689, 13382, 1701, 247, 2372, 1060, 3981, 326, 253, 4561, 17927, 24102, 1978, 5185, 342, 253, 6944, 278, 12132, 954, 651, 4665, 326, 347, 4495, 5185, 342, 253, 3302, 1375, 3268, 285, 5502, 1159, 533, 253, 5839, 253, 4477, 5196, 36908, 
2686, 1379, 715, 2395, 18525, 25720, 533, 30899, 13849, 326, 1146, 753, 436, 310, 6571, 247, 572, 487, 934, 670, 28939, 285, 667, 3177, 387, 22291, 436, 1953, 387, 512, 310, 247, 1175, 1265, 50276, 284, 5469, 275, 253, 20544, 285, 32213, 2593, 253, 2022, 14855, 310, 326, 7491, 310, 247, 8489, 15246, 6880, 273, 5368, 789, 275, 1566, 3169, 391, 77, 2299, 253, 2929, 1057, 247, 11080, 5839, 273, 436, 6880, 285, 1057, 417, 689, 7041, 697, 7680, 2490, 187, 4118, 18435, 27, 6010, 28841, 391, 77, 391, 77, 11333, 4388, 281, 3037, 7823, 1293, 18745, 342, 271, 3126, 15846, 432, 253, 1375, 285, 5231, 6107, 275, 253, 28841, 15302, 2299, 275, 1524, 10186, 15302, 253, 7031, 476, 320, 12497, 281, 3037, 1175, 7823, 3021, 352, 310, 271, 1774, 2561, 3884, 281, 3157, 253, 3410, 6733, 273, 1110, 3082, 436, 2929, 7274, 28841, 391, 77, 432, 247, 3425, 14053, 8668, 253, 2929, 47932, 247, 12955, 273, 18974, 4979, 398, 323, 941, 5978, 285, 597, 7409, 767, 273, 253, 2022, 2216, 7089, 275, 1110, 3210, 50276, 48027, 3082, 47694, 11020, 4632, 9732, 22958, 1754, 50276, 250, 2327, 273, 1566, 20419, 941, 50276, 783, 4477, 17813, 616, 2934, 327, 767, 277, 21, 8435, 8892, 50276, 324, 14790, 50276, 9450, 297, 5011, 50275, 33642, 50276, 1189, 455, 253, 2929, 310, 973, 15720, 285, 3477, 281, 2096, 253, 1543, 285, 4679, 403, 11080, 285, 253, 2929, 4566, 323, 625, 6864, 275, 253, 4679, 2581, 685, 37535, 253, 5697, 275, 253, 2929, 403, 417, 4460, 533, 253, 2929, 1057, 417, 689, 7041, 697, 9021, 285, 1543, 403, 4722, 347, 247, 906, 891, 1158, 1097, 5723, 2824, 285, 253, 16055, 28841, 391, 77, 7888, 651, 5649, 432, 253, 4342, 273, 436, 2929, 891, 717, 32085, 839, 436, 2929, 323, 14924, 50276, 783, 30628, 497, 1077, 2762, 670, 436, 2929, 1309, 253, 30080, 22559, 285, 5955, 2929, 597, 512, 5821, 326, 253, 2929, 310, 9865, 285, 4722, 7680, 281, 253, 3114, 253, 2022, 14226, 273, 436, 2929, 326, 2210, 598, 1309, 253, 5955, 2180, 369, 326, 253, 2934, 310, 816, 247, 15246, 5019, 273, 253, 5368, 5609, 2299, 253, 2934, 3559, 275, 253, 2929, 310, 18893, 5272, 285, 11407, 973, 50275, 783, 4477, 2530, 247, 1077, 7000, 30080, 22559, 342, 8254, 6787, 281, 253, 2792, 326, 30628, 5439, 347, 247, 906, 273, 253, 30080, 22559, 690, 273, 253, 30628, 2559, 616, 7363, 891, 651, 5583, 326, 253, 4477, 19071, 690, 273, 1110, 8254, 6787, 715, 253, 4049, 254, 609, 5102, 2929, 2715, 690, 273, 1110, 403, 50275, 249, 2380, 281, 37317, 898, 78, 34010, 1953, 327, 253, 1543, 342, 260, 5848, 327, 3081, 941, 4561, 407, 7491, 310, 1077, 4722, 891, 1158, 253, 4477, 943, 2486, 352, 275, 253, 4049, 254, 609, 5102, 2715, 273, 253, 2929, 50275, 38092, 4679, 327, 643, 15302, 432, 253, 277, 21, 8435, 17409, 3126, 347, 2546, 407, 37317, 898, 78, 649, 50276, 783, 4679, 2546, 407, 253, 37317, 288, 82, 91, 75, 209 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 29328, 7491, 10981, 1882, 39707, 534, 4648, 7491, 10981, 2784, 281, 1881, 16450, 625, 28841, 941, 281, 3157, 3425, 1566, 3733, 273, 18974, 39707, 275, 28841, 391, 77, 4758, 253, 4477, 1263, 1125, 2646, 254, 800, 5978, 285, 9732, 17190, 323, 18974, 5978, 275, 7491, 10981, 2784, 2112, 342, 941, 19690, 1754, 327, 7162, 4868, 10554, 5912, 285, 2969, 24642, 4715, 597, 7568, 1805, 3045, 10941, 281, 643, 28841, 391, 77, 1666, 25379, 275, 277, 21, 8435, 10895, 5742, 327, 17409, 1906, 75, 16856, 285, 519, 14790, 597, 671, 2589, 1783, 327, 7491, 10981, 1882, 941, 432, 253, 8668, 273, 3835, 2961, 2095, 285, 12420, 342, 278, 12132, 285, 271, 28913, 281, 2085, 2969, 29609, 273, 7491, 10981, 2784, 20544, 50276, 783, 2929, 310, 973, 15720, 285, 3477, 281, 956, 50276, 783, 4081, 1332, 310, 973, 24013, 8550, 285, 253, 1895, 1146, 13964, 310, 1774, 275, 970, 3425, 14053, 275, 28841, 391, 77, 50276, 783, 2022, 1543, 403, 275, 2087, 3240, 1175, 285, 1329, 253, 2201, 7680, 273, 436, 789, 50276, 20881, 1255, 50276, 1440, 250, 5551, 336, 310, 8489, 1077, 1027, 432, 246, 13473, 10019, 2686, 697, 7197, 275, 2087, 3021, 891, 4282, 604, 697, 1896, 326, 246, 13473, 10019, 342, 851, 26208, 476, 5115, 1805, 1543, 50276, 22309, 256, 21, 8435, 908, 275, 253, 3368, 760, 19401, 305, 12064, 6046, 533, 417, 671, 643, 42072, 5609, 4081, 275, 253, 3236, 2929, 751, 5926, 483, 5878, 484, 3966, 50276, 1747, 13218, 42085, 1221, 1949, 8245, 3081, 281, 17568, 3410, 432, 253, 3236, 10895, 323, 3081, 3733, 752, 670, 247, 2969, 17375, 10491, 1754, 327, 2957, 390, 941, 12177, 50276, 783, 2488, 3916, 275, 1386, 25921, 16129, 326, 47694, 11020, 5978, 310, 1805, 685, 9732, 17190, 1955, 281, 697, 625, 11117, 5978, 275, 15866, 273, 253, 16907, 34593, 8813, 253, 16888, 3064, 275, 2829, 374, 5043, 642, 7491, 549, 275, 519, 14790, 3368, 2829, 495, 2167, 253, 2488, 25957, 352, 310, 1512, 673, 33136, 891, 4282, 604, 627, 310, 271, 5795, 281, 2085, 625, 11859, 22861, 13414, 973, 15249, 253, 3908, 50276, 4064, 4677, 374, 549, 285, 28793, 1646, 281, 320, 19767, 281, 1016, 643, 342, 28793, 10949, 653, 9332, 4811, 275, 253, 3236, 941, 3268, 891, 4282, 16248, 841, 767, 7491, 10981, 2784, 5609, 342, 2969, 3082, 751, 12421, 13887, 581, 2746, 281, 6635, 941, 588, 2085, 2007, 7756, 50276, 5430, 1057, 253, 4081, 1332, 1347, 762, 44382, 8292, 20028, 5742, 7491, 10981, 2784, 8127, 7024, 327, 253, 36594, 273, 253, 3236, 1566, 26332, 42085, 1060, 891, 4282, 604, 359, 13237, 5878, 1679, 468, 14692, 941, 281, 253, 3236, 3733, 873, 24088, 15706, 690, 5110, 273, 10895, 342, 3632, 10895, 275, 277, 21, 8435, 849, 588, 436, 18123, 2818, 7491, 10981, 2784, 50276, 6050, 47694, 11020, 5978, 476, 4711, 4067, 7629, 849, 1057, 253, 20821, 2228, 2818, 7491, 10981, 2784, 47694, 11020, 5978, 342, 4067, 5978, 2978, 246, 11330, 941, 342, 4067, 11254, 432, 253, 3236, 10895, 533, 387, 253, 1072, 673, 27171, 432, 4067, 20821, 2228, 3021, 246, 778, 320, 271, 4722, 4373, 19484, 323, 28913, 1263, 281, 923, 849, 281, 9974, 247, 1175, 6654, 875, 253, 5454, 2727, 7419, 432, 1781, 3733, 673, 1955, 281, 3909, 17927, 3733, 941, 5978, 5393, 407, 253, 4477, 352, 651, 320, 5322, 281, 923, 625, 4278, 390, 1014, 4433, 4438, 4795, 432, 17620, 273, 7491, 10981, 2784, 253, 4477, 5393, 690, 273, 731, 275, 28913, 347, 973, 347, 5609, 281, 16748, 824, 2523, 533, 604, 667, 625, 352, 651, 320, 1077, 9371, 281, 923, 512, 643, 4433, 8130, 326, 253, 
4477, 452, 3597, 5474, 339, 9852, 1340, 281, 18915, 253, 1895, 326, 28841, 391, 77, 941, 310, 2223, 1355, 275, 1979, 285, 19756, 275, 7031, 436, 2929, 29328, 281, 1555, 391, 77, 347, 247, 3425, 14053, 4836, 285, 35919, 253, 3733, 941, 407, 17221, 253, 1029, 7162, 4561, 3530, 432, 253, 3425, 1566, 326, 310, 1469, 949, 3733, 50275, 783, 4081, 1332, 310, 6760, 327, 247, 8578, 273, 253, 277, 21, 8435, 5415, 1453, 8892, 285, 310, 2011, 281, 562, 32231, 2067, 1666, 25379, 1690, 39707, 3169, 285, 1566, 3169, 4394, 347, 973, 347, 260, 5848, 285, 49501, 323, 247, 2372, 689, 2716, 273, 253, 8892, 50275, 296, 3755, 20556, 50276, 783, 11333, 2022, 2133, 310, 2969, 285, 19627, 281, 643, 745, 41904, 77, 2987, 253, 760, 7658, 310, 281, 897, 247, 3425, 1566, 50275, 47606, 1783, 273, 253, 941, 3268, 4561, 407, 9732, 22958, 4632, 47694, 11020, 50276, 8250, 5740, 273, 253, 1332, 50276, 20881, 1255, 50276, 262, 310, 12744, 1880, 253, 6351, 273, 7491, 3249, 432, 337, 4465, 941, 374, 1027, 10336, 3215, 11273, 305, 431, 19, 4632, 417, 495, 690, 12794, 2867, 275, 253, 3425, 1566, 347, 10066, 281, 643, 1533, 3210, 326, 778, 760, 3283, 253, 8310, 285, 253, 10921, 50276, 262, 310, 12744, 432, 253, 2929, 1880, 7491, 10981, 2784, 310, 4460, 4457, 22296, 4715, 24088, 275, 391, 77, 627, 403, 3240, 247, 1643, 3081, 7364, 417, 5393, 275, 253, 2929, 298, 1706, 4590, 1235, 337, 253, 4465, 767, 4373, 22041, 5611, 465, 285, 50276, 15684, 1442, 292, 25004, 534, 7024, 327, 11659, 281, 253, 3126, 390, 247, 1175, 258, 365, 1332, 374, 347, 5393, 275, 298, 1787, 1867, 323, 643, 8892, 275, 2087, 352, 310, 12744, 1880, 253, 10895, 2130, 310, 4209, 281, 6194, 247, 7491, 5734, 359, 1611, 352, 534, 588, 36967, 4465, 3733, 673, 285, 2105, 347, 5393, 275, 298, 1706, 4590, 1235, 5474, 33032, 2520, 2929, 8725, 3332, 3425, 7645, 272, 7274, 281, 391, 77, 342, 1566, 3169, 941, 42072, 342, 253, 990, 906, 2819, 751, 247, 24187, 404, 1033, 1250, 256, 28738, 10226, 3614, 11830, 50232, 2061, 14369, 6903, 11838, 805, 1508, 2031, 805, 1508, 2357, 12955, 273, 253, 18974, 39707, 247, 1643, 2216, 7089, 824, 347, 253, 10491, 5700, 47694, 11020, 7147, 9732, 15189, 285, 849, 2223, 281, 33150, 253, 1566, 20419, 941, 403, 6949, 253, 5933, 310, 6760, 327, 253, 23904, 5011, 285, 519, 14790, 10625, 273, 253, 277, 21, 8435, 28841, 391, 77, 22791, 436, 2929, 19401, 247, 4942, 15246, 5019, 273, 5368, 11333, 533, 1057, 594, 16575, 352, 310, 10182, 275, 697, 16774, 27163, 342, 8245, 11640, 1690, 247, 42085, 342, 4465, 3733, 281, 20843, 253, 2538, 273, 19269, 4465, 11786, 5018, 285, 270, 271, 256, 21, 8435, 25016, 2715, 273, 42085, 281, 7277, 253, 1566, 20419, 941, 281, 643, 4948, 273, 941, 42072, 253, 2929, 310, 671, 10182, 417, 281, 689, 7041, 697, 38135, 285, 5678, 342, 697, 11080, 1255, 5604, 42506, 327, 253, 26536, 352, 8631, 275, 253, 12002, 285, 10199, 50276, 783, 2593, 327, 2007, 1783, 273, 7491, 10981, 2784, 28145, 387, 247, 1953, 891, 452, 4266, 13876, 670, 672, 247, 1566, 310, 10166, 327, 253, 1072, 6944, 941, 347, 247, 3646, 533, 476, 840, 6635, 625, 941, 326, 2007, 19132, 253, 3646, 752, 670, 253, 1566, 20419, 941, 4269, 326, 7756, 253, 3662, 310, 8489, 47641, 285, 18276, 1060, 533, 436, 1953, 556, 762, 35477, 16105, 555, 594, 891, 14109, 271, 3177, 387, 15974, 352, 45190, 1060, 891, 513, 1158, 326, 253, 10199, 1537, 452, 689, 13382, 1701, 247, 2372, 1060, 3981, 326, 253, 4561, 17927, 24102, 1978, 5185, 342, 253, 6944, 278, 12132, 954, 651, 4665, 326, 347, 4495, 5185, 342, 253, 3302, 1375, 3268, 285, 5502, 1159, 533, 253, 5839, 253, 4477, 5196, 36908, 
2686, 1379, 715, 2395, 18525, 25720, 533, 30899, 13849, 326, 1146, 753, 436, 310, 6571, 247, 572, 487, 934, 670, 28939, 285, 667, 3177, 387, 22291, 436, 1953, 387, 512, 310, 247, 1175, 1265, 50276, 284, 5469, 275, 253, 20544, 285, 32213, 2593, 253, 2022, 14855, 310, 326, 7491, 310, 247, 8489, 15246, 6880, 273, 5368, 789, 275, 1566, 3169, 391, 77, 2299, 253, 2929, 1057, 247, 11080, 5839, 273, 436, 6880, 285, 1057, 417, 689, 7041, 697, 7680, 2490, 187, 4118, 18435, 27, 6010, 28841, 391, 77, 391, 77, 11333, 4388, 281, 3037, 7823, 1293, 18745, 342, 271, 3126, 15846, 432, 253, 1375, 285, 5231, 6107, 275, 253, 28841, 15302, 2299, 275, 1524, 10186, 15302, 253, 7031, 476, 320, 12497, 281, 3037, 1175, 7823, 3021, 352, 310, 271, 1774, 2561, 3884, 281, 3157, 253, 3410, 6733, 273, 1110, 3082, 436, 2929, 7274, 28841, 391, 77, 432, 247, 3425, 14053, 8668, 253, 2929, 47932, 247, 12955, 273, 18974, 4979, 398, 323, 941, 5978, 285, 597, 7409, 767, 273, 253, 2022, 2216, 7089, 275, 1110, 3210, 50276, 48027, 3082, 47694, 11020, 4632, 9732, 22958, 1754, 50276, 250, 2327, 273, 1566, 20419, 941, 50276, 783, 4477, 17813, 616, 2934, 327, 767, 277, 21, 8435, 8892, 50276, 324, 14790, 50276, 9450, 297, 5011, 50275, 33642, 50276, 1189, 455, 253, 2929, 310, 973, 15720, 285, 3477, 281, 2096, 253, 1543, 285, 4679, 403, 11080, 285, 253, 2929, 4566, 323, 625, 6864, 275, 253, 4679, 2581, 685, 37535, 253, 5697, 275, 253, 2929, 403, 417, 4460, 533, 253, 2929, 1057, 417, 689, 7041, 697, 9021, 285, 1543, 403, 4722, 347, 247, 906, 891, 1158, 1097, 5723, 2824, 285, 253, 16055, 28841, 391, 77, 7888, 651, 5649, 432, 253, 4342, 273, 436, 2929, 891, 717, 32085, 839, 436, 2929, 323, 14924, 50276, 783, 30628, 497, 1077, 2762, 670, 436, 2929, 1309, 253, 30080, 22559, 285, 5955, 2929, 597, 512, 5821, 326, 253, 2929, 310, 9865, 285, 4722, 7680, 281, 253, 3114, 253, 2022, 14226, 273, 436, 2929, 326, 2210, 598, 1309, 253, 5955, 2180, 369, 326, 253, 2934, 310, 816, 247, 15246, 5019, 273, 253, 5368, 5609, 2299, 253, 2934, 3559, 275, 253, 2929, 310, 18893, 5272, 285, 11407, 973, 50275, 783, 4477, 2530, 247, 1077, 7000, 30080, 22559, 342, 8254, 6787, 281, 253, 2792, 326, 30628, 5439, 347, 247, 906, 273, 253, 30080, 22559, 690, 273, 253, 30628, 2559, 616, 7363, 891, 651, 5583, 326, 253, 4477, 19071, 690, 273, 1110, 8254, 6787, 715, 253, 4049, 254, 609, 5102, 2929, 2715, 690, 273, 1110, 403, 50275, 249, 2380, 281, 37317, 898, 78, 34010, 1953, 327, 253, 1543, 342, 260, 5848, 327, 3081, 941, 4561, 407, 7491, 310, 1077, 4722, 891, 1158, 253, 4477, 943, 2486, 352, 275, 253, 4049, 254, 609, 5102, 2715, 273, 253, 2929, 50275, 38092, 4679, 327, 643, 15302, 432, 253, 277, 21, 8435, 17409, 3126, 347, 2546, 407, 37317, 898, 78, 649, 50276, 783, 4679, 2546, 407, 253, 37317, 288, 82, 91, 75, 209 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: note i have reviewed a previous submission of this paper at a previous conference this paper continues a recent line of work aimed at identifying relationships between many different vote distributions where previous work used frequency matrices to measure the difference between sampled elections from given distributions this paper introduces a very natural extension to the concept and applies the frequency matrices to the vote distributions themselves the paper describes a number of election structures (singlepeaked, group-separable, mallows) and finds the frequency matrix or a polynomial time formula for generating the matrix for each distribution using these frequency matrices a new map of elections is generated this skeleton map strengthens and confirms prior work and is shown to generally represent the distance between elections well finally mallows models are generated for elections representing realworld data and placed on the map these also have some similarity to prior work however they find generating a distribution to perfectly match each real election to be difficult some promising potential future work is identified while concluding the paper represents a novel addition to a recent series of papers this approach of generalizing from sampled data to comparing entire distributions could also have potential uses in other domains the work is well written and does a very good job of connecting itself to the prior results in this line of research while the results are moderately complex i found them to be explained and structured quite clearly overall i find the paper fairly strong and have no major issues with it due to space limitations some of the figures particularly fig 2 are rather small and difficult to read i am glad to see that most of the specific minor changes i have previously suggested have been fixed the authors have not discussed the potential societal impact of their work while the contribution appears quite far removed from any possible realworld impact perhaps one of the many appendices could be used to briefly imagine how this work could be misused somewhere down the line
docsepthe authors build on recent work on a very exciting map of elections by showing how to compute the frequency matrix of various vote distributions they then use these frequency matrices to produce a skeleton map of distributions that is closely related to and visually much simpler than the map of elections approach lastly they show that they can use frequency matrices to estimate parameters of realworld election data based on the nearest distributions strengths the map of elections is a remarkably useful and insightful tool when dealing with realworld elections and having a skeleton map like this that doesnt rely on sampling individual instances is a very useful addition i particularly appreciated that this skeleton map allows us to learn parameters of realworld elections or approximations thereof the sampling results and algorithms are intuitively presented and clear to the reader despite the fact that the results are quite nontrivial weaknesses i am not overly familiar with the background behind the map of elections beyond seeing it as a very useful tool and it seems like the whole area is built upon many heuristics that seem to work pretty well but lack robust theoretical justification this is perhaps not so much of a downside to me relative to others but is definitely something worth thinking about yes
docseppreviously szufa et al 2020 and boehmer et al 2021 proposed the map of elections both as a visualization tool to group similar elections and also demonstrated that elections close to each other on the map may have similar features this paper introduces the skeleton map which is a new type of map of elections the main difference is that in the skeleton map each point is a distribution instead of a single election this change simplifies the map while preserving the key features in order to represent each vote distribution as a point this paper studies a number of prominent vote distributions and either gives an analytical formula for the frequency matrices or provides a polynomial time algorithm to calculate them the authors ran experiments to see how the distances between different vote distributions change when the number of candidates changes finally they use real life election datasets to verify that the vote distributions they consider (mallows, conitzer, walsh) with different parameters could fit some real vote datasets well but still have their limits i have previously reviewed another version of this paper originality this work is well built on two previous works szufa et al 2020 that proposed the map of elections and boehmer et al 2021 that defined compass matrices and the normalized positionwise distance they did thorough work providing algorithms to compute the frequency matrices for many vote distributions the skeleton map is a nice simplification to only have one point for each distribution instead of one point for each election quality the paper is well organized concepts and algorithms are clearly defined proved and elaborated experiments and analysis of results are well presented source code is provided my concern is that i did not find experiments showing this new method performs better than previous work clarity this paper is well written and organized significance my main concern is whether representing vote distributions by frequency matrices is better in the sense that it fits better with real life datasets than previous work that represents vote distributions by sampling elections from them it would be nice if the frequency matrix calculation could be used beyond the context in this paper again my main concern is how this paper compares with previous papers on fitting real life datasets ### Summary:
this paper works to identify relationships among different vote distributions this is done by applying previously introduced frequency matrices to the vote distributions themselves and giving formulas or algorithms for computing these the resulting map of elections seems to have especially strong realworld potential
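The frequency matrices discussed in these reviews record, for each candidate and each rank position, the fraction of votes that place that candidate at that position; distances between such matrices are what the skeleton map visualizes. The snippet below is only a minimal illustration of that idea: it uses a fixed candidate labeling and a plain L1 distance, whereas the positionwise distance in the cited papers additionally optimizes over a matching of candidates, and the impartial-culture sampling here is just a toy example.

```python
import numpy as np

def frequency_matrix(votes, num_candidates):
    """freq[c, p] = fraction of votes that rank candidate c at position p."""
    freq = np.zeros((num_candidates, num_candidates))
    for ranking in votes:                      # ranking: permutation of 0..m-1
        for position, candidate in enumerate(ranking):
            freq[candidate, position] += 1.0
    return freq / len(votes)

def naive_distance(f1, f2):
    """Plain L1 distance between frequency matrices; the real positionwise
    distance also minimizes over candidate matchings, omitted here."""
    return np.abs(f1 - f2).sum()

# toy example: impartial-culture votes vs. a fully unanimous election
rng = np.random.default_rng(0)
m = 4
impartial = [list(rng.permutation(m)) for _ in range(2000)]
unanimous = [list(range(m)) for _ in range(2000)]
d = naive_distance(frequency_matrix(impartial, m), frequency_matrix(unanimous, m))
print(round(d, 3))
```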
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 9939, 891, 452, 9814, 247, 2045, 19529, 273, 436, 2929, 387, 247, 2045, 8059, 50276, 2520, 2929, 7788, 247, 3332, 1386, 273, 789, 11205, 387, 12488, 7688, 875, 1142, 1027, 6273, 10670, 835, 2045, 789, 908, 4294, 12624, 281, 2557, 253, 3064, 875, 19958, 12337, 432, 1677, 10670, 436, 2929, 23970, 247, 1077, 3626, 6880, 281, 253, 4473, 285, 10384, 253, 4294, 12624, 281, 253, 6273, 10670, 3746, 50276, 783, 2929, 8631, 247, 1180, 273, 6132, 5289, 2014, 365, 6840, 1387, 339, 468, 494, 28974, 5811, 285, 9010, 253, 4294, 4315, 390, 247, 14189, 673, 7212, 323, 11365, 253, 4315, 323, 1016, 3268, 970, 841, 4294, 12624, 247, 747, 3711, 273, 12337, 310, 4561, 436, 29564, 3711, 4056, 49966, 285, 23849, 2720, 789, 285, 310, 2011, 281, 3839, 1957, 253, 4181, 875, 12337, 973, 4720, 28974, 5811, 3210, 403, 4561, 323, 12337, 9999, 1524, 10186, 941, 285, 4845, 327, 253, 3711, 841, 671, 452, 690, 14259, 281, 2720, 789, 2299, 597, 1089, 11365, 247, 3268, 281, 9670, 3761, 1016, 1524, 6132, 281, 320, 2834, 690, 12532, 2442, 2852, 789, 310, 3636, 1223, 26215, 253, 2929, 6125, 247, 4460, 1635, 281, 247, 3332, 2962, 273, 9380, 436, 2746, 273, 2087, 3006, 432, 19958, 941, 281, 10941, 2862, 10670, 812, 671, 452, 2442, 4648, 275, 643, 10625, 253, 789, 310, 973, 3542, 285, 1057, 247, 1077, 1175, 2628, 273, 12873, 3139, 281, 253, 2720, 1543, 275, 436, 1386, 273, 2561, 1223, 253, 1543, 403, 28249, 2570, 891, 1119, 731, 281, 320, 5544, 285, 18872, 3240, 4518, 50276, 1189, 455, 891, 1089, 253, 2929, 9648, 2266, 285, 452, 642, 2201, 3374, 342, 352, 1955, 281, 2317, 7364, 690, 273, 253, 8442, 3782, 3036, 374, 403, 2581, 1355, 285, 2834, 281, 1239, 891, 717, 9995, 281, 923, 326, 954, 273, 253, 2173, 5884, 2544, 891, 452, 3786, 5125, 452, 644, 4229, 50275, 783, 4477, 452, 417, 5469, 253, 2442, 38058, 3486, 273, 616, 789, 1223, 253, 7680, 4620, 3240, 2080, 5176, 432, 667, 1896, 1524, 10186, 3486, 4931, 581, 273, 253, 1142, 14801, 1271, 812, 320, 908, 281, 13366, 8564, 849, 436, 789, 812, 320, 3731, 3197, 9366, 1066, 253, 1386, 5474, 339, 431, 248, 4477, 1973, 327, 3332, 789, 327, 247, 1077, 12302, 3711, 273, 12337, 407, 4645, 849, 281, 11897, 253, 4294, 4315, 273, 2710, 6273, 10670, 597, 840, 897, 841, 4294, 12624, 281, 4711, 247, 29564, 3711, 273, 10670, 326, 310, 8244, 2905, 281, 285, 25910, 1199, 19554, 685, 253, 3711, 273, 12337, 2746, 1390, 314, 597, 921, 326, 597, 476, 897, 4294, 12624, 281, 6642, 3602, 273, 1524, 10186, 6132, 941, 1754, 327, 5275, 10670, 50276, 296, 3755, 20556, 50275, 783, 3711, 273, 12337, 310, 247, 24678, 4217, 285, 47860, 4968, 672, 10620, 342, 1524, 10186, 12337, 285, 1907, 247, 29564, 3711, 751, 436, 326, 36908, 10725, 327, 10491, 2060, 10872, 310, 247, 1077, 4217, 1635, 50275, 74, 3782, 14109, 326, 436, 29564, 3711, 4483, 441, 281, 3037, 3602, 273, 1524, 10186, 12337, 390, 34754, 10445, 50276, 783, 10491, 1543, 50276, 267, 46042, 403, 540, 41597, 3559, 285, 2590, 281, 253, 9414, 5747, 253, 958, 326, 253, 1543, 403, 3240, 37825, 50276, 20881, 1255, 265, 50275, 74, 717, 417, 27662, 7615, 342, 253, 4114, 3212, 253, 3711, 273, 12337, 4457, 6523, 352, 347, 247, 1077, 4217, 4968, 285, 352, 3133, 751, 253, 2644, 2170, 310, 4270, 2220, 1142, 344, 321, 3397, 326, 1646, 281, 789, 3965, 973, 533, 3480, 10237, 10527, 22861, 436, 310, 4931, 417, 594, 1199, 273, 247, 42719, 281, 479, 4103, 281, 2571, 533, 310, 7964, 1633, 4409, 4680, 670, 50275, 9820, 5474, 339, 377, 
250, 11529, 18558, 2375, 66, 1162, 355, 9169, 285, 1766, 11430, 961, 1162, 355, 43425, 4081, 3711, 273, 12337, 347, 1097, 247, 24426, 4968, 281, 1387, 2074, 12337, 533, 671, 5183, 326, 1110, 12337, 2810, 281, 1016, 643, 327, 253, 3711, 778, 452, 2074, 4735, 296, 8701, 2929, 23970, 29564, 3711, 534, 310, 247, 747, 1511, 273, 3711, 273, 12337, 253, 2022, 3064, 310, 326, 275, 253, 29564, 3711, 1016, 1127, 310, 247, 3268, 3185, 273, 247, 2014, 6132, 436, 1818, 8077, 7790, 253, 3711, 1223, 24279, 253, 2234, 3386, 275, 1340, 281, 1957, 1016, 6132, 50276, 40997, 3268, 347, 247, 1127, 436, 2929, 2175, 247, 1180, 273, 11906, 6273, 10670, 2057, 4245, 271, 16101, 7212, 273, 253, 4294, 12624, 390, 3400, 247, 14189, 673, 5933, 281, 10173, 352, 253, 4477, 6337, 4679, 281, 923, 849, 253, 13849, 875, 1027, 6273, 10670, 1818, 672, 253, 1180, 273, 9183, 1818, 4720, 597, 897, 1524, 1495, 6132, 15302, 281, 12654, 326, 253, 6273, 10670, 597, 1908, 28974, 5811, 345, 13412, 259, 932, 73, 342, 1027, 3602, 812, 4944, 690, 1524, 6273, 15302, 973, 533, 1335, 452, 616, 7787, 50276, 74, 452, 3786, 9814, 1529, 2715, 273, 436, 2929, 50275, 19164, 414, 436, 789, 310, 973, 4270, 327, 767, 2045, 2987, 18558, 2375, 66, 1162, 355, 9169, 326, 4081, 253, 3711, 273, 12337, 285, 1766, 11430, 961, 1162, 355, 43425, 326, 2931, 17066, 12624, 285, 12650, 1899, 3020, 4181, 597, 858, 247, 11080, 789, 5277, 11333, 281, 11897, 253, 4294, 12624, 323, 1142, 6273, 10670, 253, 29564, 3711, 310, 247, 5322, 8077, 1877, 281, 760, 452, 581, 1127, 323, 1016, 3268, 3185, 273, 581, 1127, 323, 1016, 6132, 50276, 15177, 253, 2929, 310, 973, 10932, 12342, 285, 11333, 403, 4518, 2931, 8058, 285, 50221, 4679, 285, 1783, 273, 1543, 403, 973, 3559, 2603, 2127, 310, 2530, 619, 4468, 310, 326, 891, 858, 417, 1089, 4679, 4645, 436, 747, 1332, 17923, 1805, 685, 2045, 789, 50276, 498, 15752, 436, 2929, 310, 973, 3542, 285, 10932, 50276, 9188, 40348, 619, 2022, 4468, 310, 1880, 9999, 6273, 10670, 407, 4294, 12624, 310, 1805, 275, 253, 3282, 326, 352, 13840, 1805, 342, 1524, 1495, 15302, 685, 2045, 789, 326, 6125, 6273, 10670, 407, 10491, 12337, 432, 352, 50274, 262, 651, 320, 5322, 604, 253, 4294, 4315, 10272, 812, 320, 908, 4457, 253, 3634, 275, 436, 2929, 50275, 16245, 619, 2022, 4468, 310, 849, 436, 2929, 26662, 342, 2045, 9380, 327, 13532, 1524, 1495, 15302, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 2987, 281, 4271, 7688, 2190, 1027, 6273, 10670, 436, 310, 2218, 407, 9433, 3786, 5611, 4294, 12624, 281, 253, 6273, 10670, 3746, 285, 4245, 7212, 390, 11333, 323, 12672, 841, 50276, 783, 4795, 3711, 273, 12337, 3133, 281, 452, 3340, 2266, 1524, 10186, 2442 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 9939, 891, 452, 9814, 247, 2045, 19529, 273, 436, 2929, 387, 247, 2045, 8059, 50276, 2520, 2929, 7788, 247, 3332, 1386, 273, 789, 11205, 387, 12488, 7688, 875, 1142, 1027, 6273, 10670, 835, 2045, 789, 908, 4294, 12624, 281, 2557, 253, 3064, 875, 19958, 12337, 432, 1677, 10670, 436, 2929, 23970, 247, 1077, 3626, 6880, 281, 253, 4473, 285, 10384, 253, 4294, 12624, 281, 253, 6273, 10670, 3746, 50276, 783, 2929, 8631, 247, 1180, 273, 6132, 5289, 2014, 365, 6840, 1387, 339, 468, 494, 28974, 5811, 285, 9010, 253, 4294, 4315, 390, 247, 14189, 673, 7212, 323, 11365, 253, 4315, 323, 1016, 3268, 970, 841, 4294, 12624, 247, 747, 3711, 273, 12337, 310, 4561, 436, 29564, 3711, 4056, 49966, 285, 23849, 2720, 789, 285, 310, 2011, 281, 3839, 1957, 253, 4181, 875, 12337, 973, 4720, 28974, 5811, 3210, 403, 4561, 323, 12337, 9999, 1524, 10186, 941, 285, 4845, 327, 253, 3711, 841, 671, 452, 690, 14259, 281, 2720, 789, 2299, 597, 1089, 11365, 247, 3268, 281, 9670, 3761, 1016, 1524, 6132, 281, 320, 2834, 690, 12532, 2442, 2852, 789, 310, 3636, 1223, 26215, 253, 2929, 6125, 247, 4460, 1635, 281, 247, 3332, 2962, 273, 9380, 436, 2746, 273, 2087, 3006, 432, 19958, 941, 281, 10941, 2862, 10670, 812, 671, 452, 2442, 4648, 275, 643, 10625, 253, 789, 310, 973, 3542, 285, 1057, 247, 1077, 1175, 2628, 273, 12873, 3139, 281, 253, 2720, 1543, 275, 436, 1386, 273, 2561, 1223, 253, 1543, 403, 28249, 2570, 891, 1119, 731, 281, 320, 5544, 285, 18872, 3240, 4518, 50276, 1189, 455, 891, 1089, 253, 2929, 9648, 2266, 285, 452, 642, 2201, 3374, 342, 352, 1955, 281, 2317, 7364, 690, 273, 253, 8442, 3782, 3036, 374, 403, 2581, 1355, 285, 2834, 281, 1239, 891, 717, 9995, 281, 923, 326, 954, 273, 253, 2173, 5884, 2544, 891, 452, 3786, 5125, 452, 644, 4229, 50275, 783, 4477, 452, 417, 5469, 253, 2442, 38058, 3486, 273, 616, 789, 1223, 253, 7680, 4620, 3240, 2080, 5176, 432, 667, 1896, 1524, 10186, 3486, 4931, 581, 273, 253, 1142, 14801, 1271, 812, 320, 908, 281, 13366, 8564, 849, 436, 789, 812, 320, 3731, 3197, 9366, 1066, 253, 1386, 5474, 339, 431, 248, 4477, 1973, 327, 3332, 789, 327, 247, 1077, 12302, 3711, 273, 12337, 407, 4645, 849, 281, 11897, 253, 4294, 4315, 273, 2710, 6273, 10670, 597, 840, 897, 841, 4294, 12624, 281, 4711, 247, 29564, 3711, 273, 10670, 326, 310, 8244, 2905, 281, 285, 25910, 1199, 19554, 685, 253, 3711, 273, 12337, 2746, 1390, 314, 597, 921, 326, 597, 476, 897, 4294, 12624, 281, 6642, 3602, 273, 1524, 10186, 6132, 941, 1754, 327, 5275, 10670, 50276, 296, 3755, 20556, 50275, 783, 3711, 273, 12337, 310, 247, 24678, 4217, 285, 47860, 4968, 672, 10620, 342, 1524, 10186, 12337, 285, 1907, 247, 29564, 3711, 751, 436, 326, 36908, 10725, 327, 10491, 2060, 10872, 310, 247, 1077, 4217, 1635, 50275, 74, 3782, 14109, 326, 436, 29564, 3711, 4483, 441, 281, 3037, 3602, 273, 1524, 10186, 12337, 390, 34754, 10445, 50276, 783, 10491, 1543, 50276, 267, 46042, 403, 540, 41597, 3559, 285, 2590, 281, 253, 9414, 5747, 253, 958, 326, 253, 1543, 403, 3240, 37825, 50276, 20881, 1255, 265, 50275, 74, 717, 417, 27662, 7615, 342, 253, 4114, 3212, 253, 3711, 273, 12337, 4457, 6523, 352, 347, 247, 1077, 4217, 4968, 285, 352, 3133, 751, 253, 2644, 2170, 310, 4270, 2220, 1142, 344, 321, 3397, 326, 1646, 281, 789, 3965, 973, 533, 3480, 10237, 10527, 22861, 436, 310, 4931, 417, 594, 1199, 273, 247, 42719, 281, 479, 4103, 281, 2571, 533, 310, 7964, 1633, 4409, 4680, 670, 50275, 9820, 5474, 339, 377, 
250, 11529, 18558, 2375, 66, 1162, 355, 9169, 285, 1766, 11430, 961, 1162, 355, 43425, 4081, 3711, 273, 12337, 347, 1097, 247, 24426, 4968, 281, 1387, 2074, 12337, 533, 671, 5183, 326, 1110, 12337, 2810, 281, 1016, 643, 327, 253, 3711, 778, 452, 2074, 4735, 296, 8701, 2929, 23970, 29564, 3711, 534, 310, 247, 747, 1511, 273, 3711, 273, 12337, 253, 2022, 3064, 310, 326, 275, 253, 29564, 3711, 1016, 1127, 310, 247, 3268, 3185, 273, 247, 2014, 6132, 436, 1818, 8077, 7790, 253, 3711, 1223, 24279, 253, 2234, 3386, 275, 1340, 281, 1957, 1016, 6132, 50276, 40997, 3268, 347, 247, 1127, 436, 2929, 2175, 247, 1180, 273, 11906, 6273, 10670, 2057, 4245, 271, 16101, 7212, 273, 253, 4294, 12624, 390, 3400, 247, 14189, 673, 5933, 281, 10173, 352, 253, 4477, 6337, 4679, 281, 923, 849, 253, 13849, 875, 1027, 6273, 10670, 1818, 672, 253, 1180, 273, 9183, 1818, 4720, 597, 897, 1524, 1495, 6132, 15302, 281, 12654, 326, 253, 6273, 10670, 597, 1908, 28974, 5811, 345, 13412, 259, 932, 73, 342, 1027, 3602, 812, 4944, 690, 1524, 6273, 15302, 973, 533, 1335, 452, 616, 7787, 50276, 74, 452, 3786, 9814, 1529, 2715, 273, 436, 2929, 50275, 19164, 414, 436, 789, 310, 973, 4270, 327, 767, 2045, 2987, 18558, 2375, 66, 1162, 355, 9169, 326, 4081, 253, 3711, 273, 12337, 285, 1766, 11430, 961, 1162, 355, 43425, 326, 2931, 17066, 12624, 285, 12650, 1899, 3020, 4181, 597, 858, 247, 11080, 789, 5277, 11333, 281, 11897, 253, 4294, 12624, 323, 1142, 6273, 10670, 253, 29564, 3711, 310, 247, 5322, 8077, 1877, 281, 760, 452, 581, 1127, 323, 1016, 3268, 3185, 273, 581, 1127, 323, 1016, 6132, 50276, 15177, 253, 2929, 310, 973, 10932, 12342, 285, 11333, 403, 4518, 2931, 8058, 285, 50221, 4679, 285, 1783, 273, 1543, 403, 973, 3559, 2603, 2127, 310, 2530, 619, 4468, 310, 326, 891, 858, 417, 1089, 4679, 4645, 436, 747, 1332, 17923, 1805, 685, 2045, 789, 50276, 498, 15752, 436, 2929, 310, 973, 3542, 285, 10932, 50276, 9188, 40348, 619, 2022, 4468, 310, 1880, 9999, 6273, 10670, 407, 4294, 12624, 310, 1805, 275, 253, 3282, 326, 352, 13840, 1805, 342, 1524, 1495, 15302, 685, 2045, 789, 326, 6125, 6273, 10670, 407, 10491, 12337, 432, 352, 50274, 262, 651, 320, 5322, 604, 253, 4294, 4315, 10272, 812, 320, 908, 4457, 253, 3634, 275, 436, 2929, 50275, 16245, 619, 2022, 4468, 310, 849, 436, 2929, 26662, 342, 2045, 9380, 327, 13532, 1524, 1495, 15302, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 2987, 281, 4271, 7688, 2190, 1027, 6273, 10670, 436, 310, 2218, 407, 9433, 3786, 5611, 4294, 12624, 281, 253, 6273, 10670, 3746, 285, 4245, 7212, 390, 11333, 323, 12672, 841, 50276, 783, 4795, 3711, 273, 12337, 3133, 281, 452, 3340, 2266, 1524, 10186, 2442 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this work proposed a systematic framework of unsupervised domain adaptation for time series data with various model selections and strategies strength include extensive baseline for the experiment opensourced the code weakness did not clarify what is the specific challenge in time series data beyond static data and how to remedy them as contributions does not cover other main time series applications time series forecasting scenarios did not cover or compare with the recent work daf of domain adaptation for time series by jin et al httpsarxivorgabs210206828 did not justify well why domain alignments are important well in daf it mentioned having both domain specific and invariance features are critical what if using different backborns such as attentionbased ones hard to understand why many criteria are proposed what is difference between scc and dev risk when to select one of them difficult to interpret the experiment results what is visual uda in the table 3 why were average values reported the observations are not much informative one on imbalanced data is not connected to model selection criterions one of backborn selection is kind of implying nonsystemtic aspect what is the real takeaway out of this may it recommend to use other possible backborn eg attentionbased one hard to understand why authors emphasize fair and realistic procedure are any of other methods not fair or realistic and why the paper is not wellwritten especially experiments parts and the message of our them are not clear in the line of overall paper takeawaytherefore the contribution seems to be not clear beyond aggregating all existing da methods and systematically deploying them docsepthe paper proposes a systematic evaluation framework named adatime which systematically evaluates different unsupervised domain adaptation methods on time series data the whole framework consists of a feature extractor a classifier and a domain alignment component the paper conducts largescale experiments adapting the stateoftheart visual domain adaptation methods to the proposed framework on time series classification tasks the findings based on the experimental results reveal the key points of applying uda to time series data pros 1 the paper comprehensively adapts the stateoftheart uda methods to time series classification tasks 2 the paper proposes a novel uda framework for time series data which may contribute to the community 3 the idea of this paper is novel to some extent cons 1 my main concern is the motivation the paper only talks about how the framework is designed while not elaborating clearly on why the framework is designed like this 2 the selected datasets are relatively too small and simple that the backbone network can only be a 1dcnn in order to avoid the overfitting phenomenon the experimental results on these toy timeseries datasets are unconvincing 3 the organization and writing of this paper should be improved the authors should pay more attention to the motivation and the details of the framework 4 it would be better if the authors explain some symbols like xtrains xtests ztrains ztests etc in figure 5 some related works are missing 1 ruichu cai jiawei chen zijian li wei chen keli zhang junjian ye zhuozhang li xiaoyan yang zhenjie zhang time series domain adaptation via sparse associative structure alignment aaai 2021 2 xiaoyong jin youngsuk park danielle c maddix yuyang wang xifeng yan domain 
adaptation for time series forecasting via attention sharing arxiv210206828 minors on page 2 the the following questions the following questions the findings are interesting and may contribute to the community however considering the motivation and the unconvincing experiments i vote for 5 marginally below the acceptance threshold docsepthis paper explores the unsupervised domain adaptation of time series data tsuda and it focuses on the benchmark construction by standardizing the base model datasets and model selection this paper provides a good benchmark of tsuda this benchmark can facilitate future research also the paper proposes some findings strengths 1 this paper evaluates the previous domain adaption algorithms under a fair setting extensive experiments are provided also the experiment results present some competitive baselines to the tsuda area this benchmark will be helpful to future research 2 based on the experiment results this paper provides some findings or analyses 3 the paper is well organized and with clear clarification weaknesses 1 the main concern is the technological novelty of this paper this paper adopts previous methods in the time series datasets which is technologically trivial especially the proposed adatime framework is also trivial can you further clarify the difference between your adatime framework and the standard domain adaption protocol in image classification it is hard for me to distinguish them that is why i think the adatime is not novel more elaborations will be very helpful for my judgment 2 some of the findings are also fragile  performance gap diminishes with a sufficient amount of data this clarification is not persuasive for me your conclusion is obtained from the comparison among three different datasets it would be helpful if you controlled the variables for example you can keep enlarging the size of one fixed dataset and record the change of results model selection has a significant effect on performance i think your experiment results are also affected by the limited data the three experiment datasets are too small to provide a robust result larger datasets should be included such as the hhar dataset used in codats 3 i am not sure about the contribution of this paper when the backbone is just cnn firstly as shown in figure 3 of the main text different backbones result in quite different results not only the numerical values but the relative performance is also changed so i think the findings of your paper may also be changed under other base models secondly as your mentioned the time series area does not have a consistent backbone thus the experiments are lacking in the backbone aspect in conclusion i think more baselines are needed if you want to obtain a general conclusion such as the tcn 4 as for the metric f1score is a wellestablished convention in the time series area such as anomaly detection and recommended system maybe f1score is not popular in vision but i still think the metric finding is too trivial this paper provides a useful benchmark for the tsuda task extensive experiments are included but because of the concern of technology novelty and the unconvincing evidence of some findings i would like to reject it after discussion the author addressed part of my question but i still think this paper is below the expectation i would like to arise the rank to 5 docsepthe paper introduces a standardized framework to systematically and fairly evaluate different domain adaptation methods on time series data deep learning has achieved a 
great success in time series classification tasks assuming access to a vast amount of labeled data for training i am not totally convinced that this is true the improvements over simple nearest neighbor models etc is not great in the related domain of anomaly detection a recent paper makes a forceful claim that most of the apparent success of deep learning for time series anomaly detection is nonsense a the paper contains a nice bake off but the main finding visual uda methods achieve comparable performance to tsuda methods on time series data is unsurprisingly i am very curious about how well one could do with a much simpler methods for har the class are walking walkingupstairs walkingdownstairs sitting standing laying it is easy to tell laying from any of the dynamic classes simply by the fact that the variance of laying is less than one tenth of the dynamic classes it is easy to tell laying from the other classes because g the acceleration due to gravity sifts from one axis to another etc i understand that simply maximizing accuracy is not the full point of the paper but i am still curious of we really need deep learning here in addition we find that model selection plays a key role and different selection strategies can significantly affect performance it would be surprising of that was not true however i do think the experiments as detailed and forceful and the community may find them useful a renjie wu current time series anomaly detection benchmarks are flawed and are creating the illusion of progress while the novelty is low the experiments as detailed and forceful and the community may find them useful docsepthis paper presents an empirical approach for unsupervised domain adaptation of time series data the paper points out some of the drawbacks of existing approaches due to inconsistencies in evaluation schemes datasets model selection rules and base neural network architectures the paper then presents adaptations of visual domain adaptation methods for time series data experimental results using ten stateoftheart methods on three benchmark datasets spanning fifteen crossdomain scenarios are presented strengths a systematic experimental approach for time series domain adaptation using multiple datasets and baseline architectures comparisons between visual domain adaptation methods and time seriesunsupervised domain adaptation methods are given guidelines for future research are given while the authors efforts in systematic experimental evaluation of existing visual domain adaptation algorithms and time seriesunsupervised domain adaptation algorithms is commendable there is no theoretical understanding at even the most basic level hence the conclusions drawn from the experiments may be specific to datasets and baseline architectures used the conclusions do not generalize ### Summary:
this work aims at giving a systematic evaluation of different unsupervised domain adaptation methods on time series classification tasks under a fair setting by providing extensive experiments on various datasets competitive baselines and model selection approaches this paper has the potential to facilitate future research on this topic if the mentioned concerns are well addressed after rebuttal and discussion the final scores were 3 5 5 5 5 ac considered all reviews author responses and the discussions as well as reading through the paper as a neutral referee and rejected the paper based on the following concerns model selection criterion as stated by the authors employing labeled target data for model selection will violate the fundamental assumption of unsupervised domain adaptation however the proposed fewshot target risk fst risk also requires labeling a few target domain samples if it is possible why not directly conduct semisupervised domain adaptation experiment details as a benchmark paper it is extremely important to
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the authors of this paper propose a platform safebench for the safety evaluation of avs they claim this platform is the first of its kind testing 8 different safety critical scenarios additionally different av algorithms are benchmarked on this simulation platform based on over 2000 safety scenarios results are generally promising tested on simple rl algorithms demonstrating tradeoffs between different av metrics such as speed and running red lights the first platform of its kind as far as i know which is a huge contribution the paper is very clear and well written this paper has usecases ranging from other researchers to actual av companies this platform could improve safety of avs central focus on just safety evaluation of ad makes the platform weaker also labeling just the performance measures of safebench as true safety is misleading traffic scenarios are not the only causes of unsafe driving limited number of ad algorithms tested coming up with a single safety evaluation metric can be detrimental to learningevaluation docsepthe authors propose a simulator environment that can be used to evaluate algorithms for autonomous driving according to 10 metrics this benchmark addresses the problem of testing ad algorithms in a standardized manner and seems to be very promising and useful for future research the benchmark software seems very useful for the autonomous vehicle research community and i can see the benchmark gaining a lot of traction the software seems to be well written and the source code looks reasonably clean and well documented it is also licensed under the mit license and available via github the comparison of different algorithms is interesting and the drop in performance in the adverse scenarios invites future investigations my main problem with the paper is that it is to me not clear what exactly it wants to be the simulator is introduced but i miss a detailed discussion of design decisions and an evaluation of the simulator itself for example i miss a rationale why the chosen scenarios are useful or better than some other scenarios why does it make sense to only keep scenarios that cause collision with at least two algorithms similarly i miss an evaluation of the metric why is the metric good how does it compare to other metrics another point i feel that should be elaborated is if simulator results are actually useful in a real world system so is an algorithm tested on this simulation better in a real world scenario however i am aware that such an evaluation is probably very complex and time consuming this is compounded by the excessive use of the appendix it is ok to push some parts or background into the appendix but at times the paper reads like a table of contents and most paragraphs seem to reference the appendix somehow it feels like the authors tried to cram two papers into one i am not sure if two papers are a better solution but perhaps a longer journal paper would be an option another option might be to move the tables into the appendix and use the additional space to evaluate the simulator and not the results of algorithms using the simulator and provide some more details on the design rationale docsepthis paper proposes safebench the first platform as far as i know for the safety evaluation of autodriving systems in this platform the author introduced several scenarios autodriving algorithms and evaluation metrics based on the platform the author
conducted extensive experiments and offered some preliminary insights 1 this paper proposes safebench the first platform as far as i know for the safety evaluation of autodriving systems 2 the benchmark has a modularized design and is easy to use which would be highly beneficial for researchers in this area 3 this platform contains several safetycritical scenarios for auto driving and multiple evaluation metrics 4 the author studied and evaluated several autodriving algorithms 1 could the users flexibly add new vehicle models to the platform if yes how 2 could the users add additional testing scenarios 3 as for the benchmarking results i suggest the author offer more analysis on these results and provide more insights into the different autodriving algorithms 4 minor i suggest the author include and add more adversarial attack methods in their platform for example 12 references 1 dual attention suppression attack generate adversarial camouflage in physical world cvpr 2021 2 camou learning physical vehicle camouflages to adversarially attack detectors in the wild iclr 2019 docsepthis paper presents safebench a unified benchmarking platform for evaluating safety violations of autonomous vehicles the proposed framework considers 8 safetycritical testing scenarios following the national highway traffic safety administration and develops four different scenario generation algorithms the benchmark includes four different deep reinforcement learning algorithms with four types of input the benchmark results suggest that the generated scenarios of safebench are more challenging for current autonomous driving practices this paper presents a novel benchmark for autonomous driving the benchmark results and findings are interesting the paper is wellwritten and easy to follow some related work is missing some experimental settings are not welljustified see details below docsepthis paper suggests a unified framework to integrate different types of safety critical testing scenarios and scenario generation algorithms with the aim to ease the comparison and the understanding of their effectiveness four different rl algorithms are also compared unified test scenarios and scenario generation algorithms the modularity of the platform including ego vehicle agent scenario and evaluation nodes where each module could be customized based on the requirements lack of computation complexity analysis for scenario generation algorithms and rl based methods not enough documentation to be able to reproduce the provided results lack of generalization analysis on realworld data docsepthe paper presents a unified framework for the assessment of the safety levels of autonomous driving algorithms against 8 different safetycritical scenarios the scenarios are generated on the carla simulator with 4 different generation methods and are evaluated with 10 metrics the code and the leaderboard are provided at the dedicated website the paper is well written and the contributions are clearly presented the work is novel and could represent a step forward in the assessment of the safety of autonomous driving the benchmark is overall well designed and fairly comprehensive and the main project page includes additional videos and the leaderboard the paper presents a few weaknesses that could be easily solved there is no description of how the traffic scenarios are generated during training of the agent additional details could be reported in the supplementary material there is no description of how the bev images are obtained in the real world
these are typically obtained with neural networks or other perception pipelines processing imagespointclouds if this pipeline is considered perfect and the bev images are collected directly in carla then it is better to highlight this point i feel like the overall score could be misleading depending on the weights that are applied maybe it could be an idea to also consider 3 different overall scores to sum up the safety functionality and etiquette levels see the illustrative sketch after the summary below the choice of the 8 scenarios selected from the nhtsa documents should be justified since the scenarios listed there are far more numerous were these scenarios selected by statistical frequency on the roads or by other metrics it is not clear how to submit additional results to the leaderboard ### Summary:
the paper addresses a muchneeded benchmark for the safety of autonomous driving all reviewers recognize the significance of the papers contributions most of the questions have been addressed satisfactorily by the rebuttal overall all reviewers are positive about the paper and recommend acceptance
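One reviewer above questions the single weighted overall score and suggests reporting separate safety, functionality, and etiquette scores. The following is a minimal illustrative sketch of the two aggregation styles being contrasted; the metric names, groupings, and weights are assumptions for illustration only, not SafeBench's actual metrics or scoring code.

```python
# Illustrative only: metric names, groupings, and weights are assumed,
# not taken from SafeBench.
from typing import Dict, List

# hypothetical per-metric results in [0, 1], higher is better
metrics = {
    "collision_rate": 0.82,       # safety (already inverted so higher = safer)
    "red_light_violation": 0.91,  # etiquette
    "route_completion": 0.74,     # functionality
    "avg_speed": 0.66,            # functionality
}

groups = {
    "safety": ["collision_rate"],
    "etiquette": ["red_light_violation"],
    "functionality": ["route_completion", "avg_speed"],
}

weights = {"collision_rate": 0.4, "red_light_violation": 0.2,
           "route_completion": 0.2, "avg_speed": 0.2}

def overall_score(m: Dict[str, float], w: Dict[str, float]) -> float:
    # single weighted score: its value depends directly on the chosen weights,
    # which is exactly the reviewer's concern
    return sum(w[k] * m[k] for k in m)

def sub_scores(m: Dict[str, float], g: Dict[str, List[str]]) -> Dict[str, float]:
    # the reviewer's alternative: one unweighted average per group
    return {name: sum(m[k] for k in ks) / len(ks) for name, ks in g.items()}

print(overall_score(metrics, weights))   # one number, weight-dependent
print(sub_scores(metrics, groups))       # three numbers, weight-free
```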
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper proposes a new type of generative models with a new inference method of latent variables specifically the gradient of latent variables with respect to a zero vector is taken as the inferred latent variables based on this the authors generalize the proposed model to implicit and variational versions and demonstrate the models on image datasets pros the proposed method is easy and straightforward to implement cons 1 the model assumption that the one step gradient from the zero vector equals the latent vector is quite limited and greatly constrains the model expressiveness a justification that such an assumption is reasonable is badly needed 2 formulation needs to be carefully checked for example eqn 2 is not entirely correct to me the second term should not be binary cross entropy as there is no categorical variable involved also please avoid using the abbreviations lbce lcce without introducing them first which is confusing 3 experimental results are not sufficient to demonstrate the efficacy need more quantitative analysis and experiments on more challenging datasets 4 the claim that it saves parameters compared to vae is confusing in the variational version parametrizations of mux and sigmax are also required a principled way to verify this claim is to show that with the variational version the method could use much fewer parameters compared to vae while achieving better synthesis quality overall the method proposed in this paper is new and promising however given the current unclear formulation and lack of strong experimental results i recommend a rejection docsepthe paper proposes gons which seek to build a generative model with an implicit encoder that comes essentially for free with the use of a few reparameterization tricks the main idea being that existing generative models with an encoder are redundant in that the decoder itself has the ability to compute the gradient with respect to a latent vector z which itself can be thought of as the encoding since the choice of what initial latent vector to choose arises here the paper advocates for simply choosing a z0 which is a zero vector in addition to the explicit formulation there is also an implicit gon which is proposed that can generalize implicit generative models like siren to entire distributions as opposed to a single data point as they are currently used overall i think this is very interesting work but incomplete considering gons are a completely new category of generative models it would greatly help to study each piece in more detail theoretically or empirically to establish what makes gons successful and different and how this improves our understanding of implicit representations in neural networks strengths an interesting and novel formulation of encoding schemes from decoders that do not need any additional training or networks the paper explores several different variants of gons from a variational alternative implicit and a classifier which greatly expands its scope of application in new problems gons generalize implicit generative models like sirens to work with an entire data distribution with very few parameters which i think is a great benefit this also naturally allows for variational alternatives meaning we can sample from complex high dimensional distributions using very simple networks the implicit gon also enables finer grid sampling in the input space enabling its use in applications like super resolution
naturally but to any image from the training distribution weaknesses the paper is very dense in terms of ideas and as such falls short in thoroughly evaluating all of them for example the paper contributes several ideas like gons implicit gons variational gons which is great but it would help if each one of those pieces were studied in some more detail so they can be compared and contextualized better with existing approaches for example in the formulation itself the gon loss is presented as is but i think it warrants some more study for example why is just a single step sufficient to estimate z does the quality of z improve if you take multiple smaller steps how stable is this for different datasets the empirical studies show promise that indeed this can work reasonably well in reconstructing different datasets but it would greatly help to justify some of these choices further in the explicit case how important is the choice of f the choice of activation function is explored but what about the architecture number of parameters for a given dataset in all the experiments the reconstruction losses are shown are for the training set how do the validation set samples get reconstructed its not clear if gons are so effective in reconstructing because they are memorizing the data how does the performance of gons change as the size of the output space grows larger for eg 128x128 or 256x256 some of the terminology is also confusing what does it mean when you overfit to an entire distribution i understand its usage for a single image but its not clear what this means for an entire dataset are the samples from figure 4 all from the same trained gon is figure 7 from an explicit gon or an implicit gon if its explicit how are the number of parameters comparable to an implicitgon clearly an explicit model will have a lot more number of parameters esp as the size of the images increase i really like and appreciate the variationalgon experiments how do they compare with standard vaes can they recover celeba 64x64 images how would they compare on quantitative metrics like fid etc in the super resolution experiment can it super resolve any image from the distribution it was trained on for eg in figure 5 is it just a matter of resampling the grid to 256x256 and running them through the pretrained model for any sample from px update on the revised manuscript i have read the new version of the paper and it reads a lot better the new expanded methods section and the definitions for different variations of gons makes the paper much stronger and easier to understand i appreciate and like the new experiments that show gons capabilities on lsun comparisons with vae on elbo most of my concerns have been addressed in this version i think this paper makes an interesting and novel contribution and i will raise my score accordingly docsepthis paper introduces a new inference method for autoencodertype models where the encoder is taken as a gradient of the decoder with respect to a zeroinitialized latent variable the method is evaluated for both a deterministic autoencoder and a vae on toy image data cifar10 being the most complex of them and applied to convolutional decoder and to sirentype implicit representation networks this is for all intents and purposes a single step iterative inference setup in its vae variant it is extremely similar to oldschool iterative inference albeit with a single gradient step the paper is verywell written and interesting the method seems to be getting very good results still the paper 
seems to be rushed the results are only on small scale and toyish datasets and there are very few baselines in its current state i recommend rejection due to rather limited novelty although its cool to see that this type of inference works for implicit scene representations and very limited evaluation there are also very many links to existing literature that are not properly described let me elaborate baselines to determine the efficacy of this method the authors would have to compare against some similar methods including oldschool multistep variational inference semiamortized variational inference the proposed method with multiple gradient steps the proposed method with detached gradient as in not use 2nd order gradients a fullyconvolutional autoencoder with parameters tied between the encoder and decoder this is for two reasons a this would reduce the number of parameters by half making it more similar to gon but also b the transposedconvolution used in such a setup corresponds almost exactly to the gradient of the encoder which is an idea very similar to gons missing links to the literature the above fullyconv ae setup modelagnostic metalearning and related eg cavia leo etc where the latents are produced by single or multistep optimization missing experiments we would need more evidence to determine if such a simple method is useful a good experiment would be eg on imagenet further suggestions subfigures in fig2 and 3 and most of figs in the appendix use different scales on the y axis it would be easier to read the figures if the scaled were normalized within a single figure update ive updated the score given the authors response see my comment below ### Summary:
this paper presents a new inference mechanism for latent variable models by taking the derivative of loglikelihood with respect to a zerovalued vector initially the reviewers raised concerns mostly regarding the limited experimentation and missing baselines however in the revised version the authors addressed most of these concerns given that most reviewers are positive after the revision and since the proposed method is simple and interesting i recommend accepting this paper
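The reviews above describe the gon inference step only in prose: the latent code is obtained by taking the gradient of the reconstruction loss with respect to a zero-initialized latent, so no separate encoder network is trained. The following is a minimal sketch of that one-step inference, not the authors' implementation; the decoder module, the mse reconstruction loss, and the use of the negative gradient as the latent are assumptions on my part, and the exact sign or step-size convention may differ in the actual paper.

```python
import torch
import torch.nn.functional as F

def gon_step(decoder, x, latent_dim):
    """One-step latent inference as described in the reviews (hypothetical sketch).

    decoder: any nn.Module mapping a latent vector to a reconstruction of x
             (architecture is an assumption, not taken from the paper).
    """
    # start from the origin of latent space, as the reviews describe
    z0 = torch.zeros(x.size(0), latent_dim, device=x.device, requires_grad=True)
    inner_loss = F.mse_loss(decoder(z0), x)
    # the gradient w.r.t. the zero latent plays the role of the encoder;
    # create_graph=True keeps second-order terms so the decoder can still be trained
    (grad_z,) = torch.autograd.grad(inner_loss, z0, create_graph=True)
    z = -grad_z  # one gradient-descent step from the origin (sign/step size assumed)
    outer_loss = F.mse_loss(decoder(z), x)
    return z, outer_loss
```

Training would then backpropagate outer_loss into the decoder parameters only, which is what makes the encoder "come for free"; a multi-step variant, which one reviewer asks about, would simply repeat the inner gradient step several times before computing the outer loss.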
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper can be seen as a modification of sac in a multiagent setup by adding the conditional entropy hpiipij as a second set of regularization on top of hpii the overall idea and intuition appear to be interesting 1 the first question is whether the mutual information is informative enough mutual means two a how about more eg haiaj ak when eq2 is still our target the underlying assumption is aj and ak would independently affect ai is that true and how the conditional distribution of aj on ak and vice versa would influence the objective 2 the motivation for inducing mutual information using a latent variable is not well motivated i can only find the first sentence in section 41 3 would the new regularization benefit other rl algorithms say would the regularization combined with the baseline algorithms shown in the experiment be better compared with those without regularization 4 the same latent variable z is shared by all the policies as shown in fig 1b would zij make more sense essentially would decentralized training be more reasonable 5 in the experiment multiagent soft actorcritic is missing ie only keep hpij in the regularization 6 how is alpha selected docsepsummary the authors propose to include the mutual information between agents simultaneous actions in the objective to encourage coordinated behaviour to induce positive mutual information the authors relax the assumption that the joint policy can be decomposed as the product of each agents policy independent of each other given the state and they achieve so by introducing a latent variable that correlates agents behaviours since the mutual information is difficult to compute the authors proposed to maximise a parametric lower bound the algorithm is theoretically motivated as a policy iteration variation in its exact tabular form but experiments are performed with neural network approximations on some environments numerical results show improvements over previous similar techniques strong points the mutual information objective the latent variable the variational lower bound and the algorithm are well motivated the paper is well written simulation results seem convincing weak points the decentralised execution relying on having random generators with the exact same seed is not a robust solution the presentation as modified policy iteration is ok but quite straightforward the characterisation as a contraction is similar to many other works maybe a citation would have helped questions is coordination always desirable could there be some adversarial example of an environment where the optimal policy requires lack of coordination wouldnt the current method suffer in such a case does the paper assume finite stateaction sets if so please say it explicitly since the environment has stochastic transitions arent more assumptions needed in order to allow gamma 1 and still ensure existence of an optimal policy regarding the rightmost term of b in 7 is it missing from 6 and 8 comments that didnt influence the score last paragraph of page 2 to explore widely seems loose to enhance exploration might be more accurate the authors use the term causal diagram but it seems to refer to a bayesian network which indicates correlation rather than causality is that right third line of the conclusions paragraph in sec 7 wouldnt it be more accurate to say applying approximate policy iteration docsepsummary this paper proposes a maximum mutual information framework for
cooperative marl following the insight that mutual information of agents policies is the indicator of coordination this paper proposes vm3ac an maac algorithm that optimizes longterm reward as well as a variational lower bound of mutual information in the paradigm of ctde experimental results show superiority of proposed algorithm in comparison with a few benchmark approaches detailed comments i appreciate the idea of utilizing mutual information mi of agents policies to facilitate the coordination of cooperative multiple agents in my personal opinion the mi of agents policy can be viewed as an indicator of coordination although the coordination may not be a goodoptimal one the results of temperature parameter experiments also demonstrate this i will mention this point later below my most concern is about the latent variable z i agree that the latent variable z should be an unobserved variable which reflects some information of coordination eg such a variable can be implicitly induced during the learning process of policies of agents in this paper the latent variable z in both learning and execution especially the authors use a common and predetermined sequence of z is generated before execution for agents in my opinion this violates the ctde paradigm which is claimed by the authors since such a common sequence of z is more like explicit signals of how to perform coordination not truly decentralized i am willing to hear the further understanding about the practical role of z from the authors moreover it seems that the two of three environments ie predatorprey and cooperative navigation are not partially observable po it would be better to provide more evaluation under po environments questions can the authors further discuss the reason to stop the gradient of term a in equation 45 since such a mechanism results in a practical optimization objective different from variation approximate mi as in algorithm 1 in decentralized execution phase the latent z is set to zerovector which conflicts the random sequence generated from the same random process as described in section 5 which one is the practical way and what is the difference in learning performance below equation 13 it says that l 1 is used for mc expectation is l 1 is sufficient for good performance and how can larger values of l influence the learning process suggestions as mentioned above i view mi as an indicator of coordination however the coordination may not be preferred therefore maximize the mi sometimes may trap the agents in a suboptimal coordination although the rewardobjective may help them out to some degree one potential evidence is a larger temperature parameter 015 hampers the learning performance as shown in figure 5 cd besides the stabilization induced by stop partial gradients of equation 45 may also be explained by this point therefore a more sophisticated mechanism for adaptive optimization of mi can be considered one possible concern also possible future work of proposed approach is the current modeling of mi is pairwise modeling of which the apparent drawback is the computational complexity can increase quadratically to the agent number as in this paper the experiments contain environments with up to 4 agents a development towards more agents should be considered minors in figure 5 last two subplots are both labeled as c temperature parameters are denoted by alpha in legends but beta in paragraph overall i think this paper takes a good attempt to study mi in cooperative marl proposing a reasonable algorithm with 
promising experimental results docsepthis paper proposes regularizing the conventional marl learning objective with a mutual information term to encourage more correlated behaviors among different agents the contribution is clearly stated however the similarity to previous works is not sufficiently discussed and the paper leaves out some important related works the paper maximizes the mutual information between agents policies given the current states to allow policies to be conditionally dependent the authors assume there exists a dummy variable the contributions of the paper about the lower bound of the mutual information sec 42 and policy update sec 43 are not significant the lower bound of the mutual information has recently been extensively explored in multiagent settings used for encouraging role emergence minimized communication and exploration the policy update and policy improvement guarantee can be easily obtained based on soft reinforcement learning literature major concern about related works my major concern is not about the two contributions mentioned above instead i think that the first and main contribution of this paper is a subset of previous works contributions edti 1 discusses how to maximize the mutual information between the trajectories of different agents their discussion already covers the correlation of policies of different agents at a single timestep more importantly the authors of that paper also point out that only optimizing mutual information between trajectories is not enough because the reward signal has to be considered for better policy learning they even discuss the secondorder influence between these mutualinformationbased intrinsic rewards the contribution of this paper seems to be the first part of 1 frans oliehoek and other researchers also did lots of excellent work on this topic 2 3 4 5 however the discussion about these related works is absent from this paper additionally the difference between the proposed method and jaquess paper social influence is not significant the only difference is whether the action of other agents aj is ajt or ajt1 when calculating the mutual information in jaquess paper they prove that their formulation is equivalent to a mutual information formulation i do not think the authors definition is an improvement of that of jaques at least the authors should provide a more serious discussion about this point perhaps providing a matrix game to show that different timesteps do indeed make a difference the authors may argue that their experiments show that their method has better performance the problem is that ssd used by jaques is a more challenging task than those used in this paper moreover social influence is sensitive to hyperparameter settings and needs finetuning to reach its full potential i will change my mind regarding the experiments if the authors could provide ssd results adopting their methods to tasks with discrete actions is not very difficult besides jaques et al also discuss how to make conditional dependency clear between agents and how to carry out influential communications which are not discussed in this paper 1 wang t wang j wu y and zhang c 2019 influencebased multiagent exploration iclr 2020 spotlight 2 f a oliehoek s witwicki and l p kaelbling influencebased abstraction for multiagent systems in proceedings of the twentysixth aaai conference on artificial intelligence pages 14221428 july 2012 also see a recent longer version httpsarxivorgabs190709278 3 r becker s zilberstein v lesser and c v goldman
transitionindependent decentralized markov decision processes in proceedings of the international conference on autonomous agents and multiagent systems pages 4148 2003 4 miguel suau de castro elena congeduti rolf an starre aleksander czechowski and frans a oliehoek influencebased abstraction in deep reinforcement learning in proceedings of the aamas workshop on adaptive learning agents ala may 2019 5 frans a oliehoek and christopher amato a concise introduction to decentralized pomdps springerbriefs in intelligent systems springer may 2016 ### Summary:
overview this paper introduces a maximum mutual information method for helping to coordinate rl agents without communication discussion some reviewers leaned towards accept but i found the two reviewers recommending rejecting to be more convincing recommendation this is an important research topic and im glad this paper is focusing on the problem hopefully the reviews will help improve a future version of this paper i agree that this is a new way of using mutual information but it seems more like a small improvement rather than a very significant step forward in addition i think the setting needs to be better motivated this is a centralized training with decentralized execution ctde setting and this paper helps the agents coordinate in ctde the agents work in the environment and then pool their information to train before deploying on the next episode i dont understand why eg in multiwalker agents would not be able to communicate while walking can communicate after they succeed or drop the object the episode ends and then cannot communicate once the next episode starts
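Both reviews of this paper lean on maximising a variational (parametric) lower bound on the mutual information between agents' simultaneous actions given the state. For reference, the generic bound that such constructions are usually built on is written out below; the notation (actions a^i and a^j, state s, variational distribution q_xi) is mine, and whether the reviewed paper uses exactly this form is an assumption.

```latex
% Generic variational (Barber--Agakov style) lower bound on mutual information,
% written here for two agents' actions given the state; notation is illustrative.
\begin{aligned}
I\bigl(a^{i}; a^{j} \mid s\bigr)
  &= H\bigl(a^{i} \mid s\bigr) - H\bigl(a^{i} \mid a^{j}, s\bigr) \\
  &\ge H\bigl(a^{i} \mid s\bigr)
     + \mathbb{E}_{(a^{i},\,a^{j}) \sim \pi(\cdot \mid s)}
       \bigl[\log q_{\xi}\bigl(a^{i} \mid a^{j}, s\bigr)\bigr]
\end{aligned}
```

The gap is the kl divergence between the true conditional pi(a^i | a^j, s) and q_xi, so maximising over the variational parameters xi tightens the bound; adding the resulting term to the usual return objective gives the kind of mi-regularized actor loss the reviews discuss.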
[ 4833, 253, 8103, 50276, 19, 783, 16038, 323, 24635, 15577, 1491, 970, 21624, 4778, 310, 417, 973, 17194, 891, 476, 760, 1089, 253, 806, 6197, 275, 2593, 7609, 50275, 20, 12756, 253, 747, 37820, 5649, 643, 391, 77, 11333, 1333, 651, 253, 37820, 5678, 342, 253, 8245, 11333, 2011, 275, 253, 3368, 320, 1805, 2429, 342, 1110, 1293, 37820, 50273, 21, 783, 1072, 21624, 4778, 1182, 310, 6096, 407, 512, 253, 7823, 347, 2011, 275, 3036, 337, 67, 651, 1182, 1944, 1056, 625, 3282, 9093, 651, 40880, 3733, 320, 625, 5272, 50275, 22, 249, 253, 3368, 4471, 12788, 2602, 12353, 68, 17425, 310, 5816, 26332, 760, 1978, 288, 81, 1944, 275, 253, 37820, 50276, 23, 5430, 310, 9765, 4236, 50276, 7152, 339, 793, 360, 3454, 253, 4477, 12661, 281, 2486, 253, 15577, 1491, 875, 6083, 19645, 5231, 275, 253, 8103, 281, 11907, 25899, 8770, 281, 10808, 2762, 15577, 1491, 253, 4477, 7921, 253, 9376, 326, 253, 6036, 3646, 476, 320, 45765, 347, 253, 1885, 273, 1016, 6083, 3646, 3907, 273, 1016, 643, 1677, 253, 1375, 285, 597, 5115, 594, 407, 16984, 247, 21624, 4778, 326, 27972, 6083, 32536, 1580, 253, 15577, 1491, 310, 2834, 281, 11897, 253, 4477, 4081, 281, 11903, 885, 247, 36833, 2406, 3033, 253, 5933, 310, 28055, 17194, 347, 247, 3646, 19502, 7629, 275, 697, 3242, 10334, 792, 830, 533, 4679, 403, 2684, 342, 11454, 2990, 34754, 327, 690, 12620, 10704, 1543, 921, 11701, 689, 2045, 2074, 5609, 50276, 9072, 2792, 253, 15577, 1491, 8103, 253, 21624, 4778, 253, 39762, 2406, 3033, 285, 253, 5933, 403, 973, 17194, 2929, 310, 973, 3542, 50276, 3549, 1427, 1543, 1646, 21414, 50276, 20881, 2792, 253, 31331, 1701, 10636, 22128, 327, 1907, 3632, 21025, 342, 253, 3242, 1072, 8357, 310, 417, 247, 10237, 2900, 253, 9759, 347, 7321, 3646, 19502, 310, 8718, 533, 3240, 15246, 253, 1894, 5837, 347, 247, 22170, 310, 2074, 281, 1142, 643, 2987, 5046, 247, 25577, 651, 452, 6518, 50275, 34974, 310, 1900, 19915, 11408, 812, 320, 690, 48960, 1650, 273, 271, 3126, 835, 253, 8654, 3646, 4419, 3480, 273, 19915, 651, 2649, 253, 1655, 1332, 11089, 275, 824, 1083, 1057, 253, 2929, 5467, 6486, 1375, 1913, 5239, 604, 594, 4496, 1333, 352, 11120, 1580, 253, 3126, 556, 19191, 16307, 403, 2649, 625, 13260, 3058, 275, 1340, 281, 1581, 17356, 50276, 18, 285, 1335, 5416, 6242, 273, 8654, 3646, 5001, 253, 987, 2252, 1307, 273, 270, 275, 818, 310, 352, 5816, 432, 721, 285, 854, 50276, 26122, 326, 42126, 4833, 253, 4868, 1390, 12494, 273, 3239, 374, 281, 8338, 7561, 3133, 13155, 50276, 936, 7278, 17947, 1537, 320, 625, 7899, 253, 4477, 897, 253, 1307, 19349, 10659, 533, 352, 3133, 281, 3730, 247, 17699, 16561, 2990, 534, 6492, 5921, 2581, 685, 46449, 310, 326, 987, 2626, 1386, 273, 253, 11815, 12494, 275, 4706, 818, 651, 2649, 320, 625, 7899, 281, 1333, 9433, 16851, 3646, 19502, 7152, 339, 793, 360, 3454, 50276, 2520, 2929, 29328, 247, 4869, 15577, 1491, 7792, 323, 27293, 2304, 77, 1563, 253, 12288, 326, 15577, 1491, 273, 6083, 7823, 310, 253, 15301, 273, 19915, 436, 2929, 29328, 31940, 20, 317, 271, 6429, 317, 5933, 326, 5556, 4219, 1048, 3945, 10921, 347, 973, 347, 247, 39762, 2406, 3033, 273, 15577, 1491, 275, 253, 22199, 273, 45830, 615, 5661, 1543, 921, 34385, 273, 4081, 5933, 275, 5301, 342, 247, 1643, 22791, 7274, 50275, 5992, 7193, 5701, 50276, 74, 11435, 253, 2934, 273, 17617, 15577, 1491, 3641, 273, 6083, 7823, 281, 12454, 253, 19915, 273, 27293, 2709, 6083, 275, 619, 3367, 4743, 253, 3641, 273, 6083, 3646, 476, 320, 11575, 347, 271, 15301, 273, 19915, 3738, 253, 19915, 778, 417, 320, 247, 1175, 29776, 581, 253, 1543, 273, 3276, 4764, 4679, 671, 7568, 
436, 891, 588, 3748, 436, 1127, 1996, 2708, 50275, 2577, 954, 4468, 310, 670, 253, 21624, 4778, 1182, 891, 5194, 326, 253, 21624, 4778, 1182, 943, 320, 271, 440, 45912, 4778, 534, 13806, 690, 1491, 273, 19915, 24088, 824, 247, 4778, 476, 320, 29688, 5802, 1309, 253, 4715, 1232, 273, 7823, 273, 6083, 275, 436, 2929, 253, 21624, 4778, 1182, 275, 1097, 4715, 285, 10636, 3340, 253, 4477, 897, 247, 1846, 285, 17095, 3425, 273, 1182, 310, 4561, 1078, 10636, 323, 6083, 275, 619, 4743, 436, 28096, 253, 45830, 615, 22199, 534, 310, 7558, 407, 253, 4477, 1580, 824, 247, 1846, 3425, 273, 1182, 310, 625, 751, 6843, 6298, 273, 849, 281, 1347, 19915, 417, 7777, 40880, 891, 717, 7378, 281, 4089, 253, 2007, 4685, 670, 253, 8542, 2554, 273, 1182, 432, 253, 4477, 50276, 3062, 1189, 352, 3133, 326, 253, 767, 273, 1264, 12620, 26332, 41143, 3456, 90, 285, 27293, 15034, 403, 417, 10571, 24802, 2963, 352, 651, 320, 1805, 281, 2085, 625, 7103, 762, 2963, 12620, 50275, 34974, 50276, 5092, 253, 4477, 2007, 2319, 253, 1921, 281, 3523, 253, 11786, 273, 1307, 247, 275, 5150, 5329, 1580, 824, 247, 5122, 1543, 275, 247, 8542, 13757, 8103, 1027, 432, 7629, 16851, 3641, 50275, 284, 275, 5933, 337, 275, 40880, 10636, 3408, 253, 21624, 1182, 310, 873, 281, 1182, 254, 710, 1870, 534, 15272, 253, 3632, 3425, 4561, 432, 253, 1072, 3632, 1232, 347, 2529, 275, 2593, 608, 534, 581, 310, 253, 8542, 1039, 285, 752, 310, 253, 3064, 275, 4715, 3045, 50276, 27490, 5150, 2145, 352, 2296, 326, 298, 50276, 18, 310, 908, 323, 278, 68, 15355, 310, 298, 50276, 18, 310, 4209, 323, 1175, 3045, 285, 849, 476, 4067, 2193, 273, 298, 4833, 253, 4715, 1232, 50276, 35640, 621, 50276, 284, 5393, 1840, 891, 1859, 3641, 347, 271, 15301, 273, 19915, 2299, 253, 19915, 778, 417, 320, 9013, 3103, 22950, 253, 3641, 4536, 778, 15464, 253, 6083, 275, 247, 749, 29776, 19915, 3738, 253, 10921, 6082, 422, 778, 1361, 731, 562, 281, 690, 4248, 581, 2442, 1941, 310, 247, 4067, 3276, 4764, 470, 1010, 288, 1301, 398, 253, 4715, 3045, 347, 2011, 275, 4677, 608, 22942, 16280, 253, 28366, 5802, 407, 3523, 7898, 27935, 273, 5150, 5329, 778, 671, 320, 5544, 407, 436, 1127, 3103, 247, 625, 18144, 5122, 323, 17825, 13757, 273, 3641, 476, 320, 2783, 50276, 531, 1896, 4468, 671, 1896, 2852, 789, 273, 4081, 2746, 310, 253, 1655, 14053, 273, 3641, 310, 28208, 14053, 273, 534, 253, 5165, 32489, 310, 253, 15180, 10454, 476, 2572, 13284, 5372, 281, 253, 5570, 1180, 347, 275, 436, 2929, 253, 4679, 3831, 12620, 342, 598, 281, 577, 6083, 247, 2440, 4404, 625, 6083, 943, 320, 2783, 50275, 1222, 641, 50276, 249, 4677, 608, 1390, 767, 749, 42045, 403, 1097, 13130, 347, 260, 3276, 3602, 403, 17007, 407, 9765, 275, 38209, 533, 9840, 275, 12494, 50275, 1189, 455, 891, 1158, 436, 2929, 3936, 247, 1175, 3177, 281, 1263, 3641, 275, 27293, 2304, 77, 36636, 247, 5272, 5933, 342, 12532, 5661, 1543, 5474, 33032, 2520, 2929, 29328, 3963, 3006, 253, 6041, 2304, 77, 4715, 8103, 342, 247, 15577, 1491, 1307, 281, 11907, 625, 9578, 13576, 2190, 1027, 6083, 253, 7680, 310, 4518, 4767, 2299, 253, 14259, 281, 2045, 2987, 310, 417, 10481, 5469, 285, 253, 2929, 6505, 562, 690, 1774, 2905, 2987, 50276, 783, 2929, 11903, 4219, 253, 15577, 1491, 875, 6083, 7823, 1677, 253, 1655, 3054, 281, 1581, 7823, 281, 320, 17697, 7976, 253, 4477, 5467, 627, 4961, 247, 28726, 4778, 253, 9021, 273, 253, 2929, 670, 253, 2406, 3033, 273, 253, 15577, 1491, 4706, 5976, 285, 3646, 5731, 4706, 7652, 403, 417, 1534, 253, 2406, 3033, 273, 253, 15577, 1491, 556, 4102, 644, 18171, 14859, 275, 4471, 12788, 7533, 908, 323, 18462, 
(token-ID sequence omitted)
(attention mask omitted: a list of all 1s matching the token sequence length)
(token-ID sequence omitted)
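The numeric columns above are collapsed to placeholders for readability. As a rough illustration of how rows like these are typically produced, here is a minimal sketch assuming a Hugging Face tokenizer and a causal-LM setup in which the labels column simply repeats input_ids (which the rows shown here appear to do); the tokenizer name, function name, and truncation length are assumptions for illustration, not taken from this dataset's actual preprocessing.

```python
# Minimal sketch of turning an (Input, Output) row into input_ids /
# attention_mask / labels for causal-LM fine-tuning.
# Assumptions: a Hugging Face tokenizer and labels that mirror input_ids;
# the actual dataset may have been built differently.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # assumed tokenizer, not confirmed

def encode_row(review_input: str, summary_output: str, max_len: int = 2048):
    # Concatenate the prompt/review with its target summary into one sequence.
    text = review_input + "\n" + summary_output
    enc = tokenizer(text, truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all 1s when there is no padding
        "labels": list(enc["input_ids"]),         # mirrors input_ids for the LM loss
    }
```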
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This paper studies the problem of answering first-order questions (more on the terminology later) that correspond to a single fact in a knowledge graph (KG), and focuses on the cross-domain setting where no curated training examples are provided for the unseen test domain. The proposed base KGQA model is modified from the state-of-the-art model on SimpleQuestions from Petrochuk and Zettlemoyer (2018), but with the relation prediction component changed from a classification model to a ranking model to better handle unseen relations (more on this later). The key contribution is a way of generating synthetic questions for the relations in the unseen domain for data augmentation. The generation model is from Elsahar et al. (2018), but is augmented with relation-specific keywords mined from Wikipedia via distant supervision. Evaluation on reshuffled SimpleQuestions shows that the proposed method can achieve a reasonable performance on 6 selected test domains of large to moderate scale, and the question generation strategy is better than several baselines.

Strengths:
- Overall, the paper is well-written and easy to follow.
- Cross-domain semantic parsing/question answering is a very important problem because of the broad applicability of the technique.
- The evaluation appears to be well designed and shows some interesting and solid results.

Weaknesses:
- Overall, the technical contribution appears to be marginal; it's largely a recombination of known techniques for a simpler version of a widely-studied problem, cross-domain semantic parsing.
- The paper rightfully points out the importance of the cross-domain setting. It is, however, a bit surprising to see that the discussion of related work is entirely confined to the works on SimpleQuestions. For a number of clear reasons, building semantic parsing models/training methods that can generalize across domains is a well-recognized demand and has received much attention; it is, for example, a built-in requirement for a number of recent text-to-SQL datasets like Spider. Even just focusing on knowledge graphs/bases, there have been many studies in the recent few years (see several early ones listed as references in the end). I'd note that the setting of this paper is sufficiently different from most of the existing studies because it only focuses on questions that correspond to a single fact, but it'd benefit the readers to better position this work in the broader literature.
- The necessity of the proposed modifications to the base KGQA model doesn't seem totally clear to me. Why not just use the state-of-the-art model from Petrochuk and Zettlemoyer (2018) and augment it with the generated questions, or at least use it as a baseline? There might need to be a few minor adjustments to the base model, but it doesn't seem to me it would be substantial. The motivation for a ranking-based relation prediction model is given as "this way we can in principle represent any relation r ∈ R during inference time"; however, this doesn't seem to be a very convincing argument. When a new domain is added, in order to apply the proposed data augmentation we would need to retrain the KGQA model; at that point we would have known the relations in the new domain for the purpose of data augmentation, so why couldn't we train the multi-class classifier again on the augmented data with the new relations added?
- There are a number of places in the proposed method that build popularity as an inductive bias into the method. For example, answer selection always selects the one with the most popular subject entity. In order for the Wikipedia-based distant supervision to work, the entity pairs of a relation are required to exist on and be linked to Wikipedia, which only a fraction of popular entities in Freebase do. Related to that, evaluation is also only conducted on domains that are well-populated in Freebase. This is not desired for cross-domain semantic parsing because (1) it's more of an artifact of current datasets, and (2) cross-domain semantic parsing is more valuable if it could work for less popular domains (the long tail); for popular domains, it's more likely one may be willing to pay the cost of data collection because the incentive is higher.

Minor:
- Personally, I don't think "first-order" is the best term for this type of question answering because it's easily confused with first-order logic (the description logics behind the semantic web/knowledge graphs are a subset of first-order logic). "Simple" or "single-relational", though still not perfectly precise, may be slightly better if we have to give it a name.

1. Cross-domain semantic parsing via paraphrasing. EMNLP 2017.
2. Decoupling structure and lexicon for zero-shot semantic parsing. EMNLP 2018.

docsep

This paper presents a simple approach for domain adaptation in knowledge graph question answering. The paper considers the setting where the knowledge graph used to back the QA system contains the necessary facts for a test-time domain, but the training data didn't cover examples that required inference over that subdomain. To bridge the gap, the paper proposed a simple procedure for constructing synthetic questions over the relations in the test domain.

Pros:
- The task definition considered is appealing and identifies another area in which SimpleQuestions is not solved.
- The approach yields modest but consistent empirical gains across the different domains considered.
- It is relatively simple, with few assumptions, making it more likely to generalize.

Cons:
- The novelty of the paper is fairly limited. Synthetic question generation has been well studied in other areas of QA, including training models from synthetic data only; domain adaptation is also well studied. Wu et al. (cited here) also study adaptation to unseen relations for KBQA, which is inherently closely related to unseen domain adaptation.
- Though not a flaw per se, the generation method is fairly simplistic, which might work well for something like SimpleQuestions, which hardly requires natural language understanding, but not for datasets with richer language structure.
- The empirical gains are small; most of the benefit seems to be coming from the improved RP network, which uses standard components.
- None of the submodules are pretrained. It would be interesting to see if using a pretrained encoder, such as a large language model (BERT etc.), would help in covering the gap in linguistic variation/understanding across domains.

docsep

The paper studies the problem of single-relation QA. It proposes a data generation technique for domain adaptation. The model is built using a known decomposition of the problem, with emphasis on making relation prediction easy to retrain using generated data. Additional question data is generated to retrain the model for previously unseen domains.

Overall, the paper is well written and motivated, and the results are convincing. The paper is narrow and focused in its contribution, but the problem is significant enough to merit such a focused contribution.

There are some issues it would be good to get the authors' input on:
a) The paper provides few examples and no figures; both would have done much to illustrate the approach and make the paper more accessible. In fact, except for the intro, there is not a single question example. This is also critical in the experimental results, which provide little insight into the numerical results, e.g. through qualitative analysis.
b) There's no evaluation or even examples of the generated questions. Ideally this should be done with human evaluation; this can really help understand the gap between the system performance and oracle questions.
c) During experiments, do you train once and then tune and test for each unseen domain, or do you include the other 5 unseen domains in the training set?
d) Some of the results are only reported in text; they should be in tables. Some numbers are just missing: when you report the performance on seen relations, it's really important to provide the numbers for other recent approaches and not just provide the much less informative ranking. If the paper is short on space, all the tables can be merged into a single table with some separators.
e) The related work section adds little, coming like this at the end, when most of the papers mentioned, if not all, were already discussed at a similar level or deeper before.

Some more minor issues that the authors should address:
a) The method seems to require enumerating and distinguishing domains; it's not specified how this is done in the KB used in the paper. This should be made clear.
b) What is "terms" in Section 2 referring to? Is this a non-standard way to refer to tokens?
c) For mention detection, why not use BIO tagging? This is the most common standard and seems like a perfect fit here; the current choice seems suboptimal.
d) In RP, the embeddings are initialized with word2vec, but what about the entity placeholder token? Also, do you use any initialization or embedding sharing between the natural language and the relation tokens?
e) For AS, the paper mentions using a heuristic based on popularity. Does this really address the problem, or does it maybe work just because of some artifacts in the data? It's OK to use this heuristic, but a 1-2 sentence discussion would help.
f) The first paragraph in Section 4.1 is confusing with how it sets expectations for what is described below it. For example, the mention of Wikipedia sentences is confusing; it's clarified later, but still (again, a figure and examples would help a lot). The mention of randomly initialized embeddings in the next paragraph is confusing without mentioning training.
g) Some typos: "we create a extract of", "users with the same intend", "many".
h) Why take the median for evaluation? Is it strictly better than mean and std dev?
i) The use of RQx for research questions is not working; the reader just can't remember what each is referring to.
### Summary:
This paper studies the problem of simple question answering over new, unseen domains at test time. A domain adaptation framework and a seq2seq question generation method have been proposed to tackle this problem, and they demonstrate significant improvements over the previous baselines. All the reviewers agreed that this paper is well-written and the results are convincing, but the problem is relatively narrow with a focused contribution. Several reviewers also questioned whether this paper contains enough technical contributions. Some other issues have already been addressed during the discussion phase (long-tail relations, presentation issues, and adding more related work). However, we recommend accepting the paper: considering the simplicity and effectiveness of the approach, we think it would lead to more discussion/future work in this direction.
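The first review above debates whether relation prediction (RP) should stay a fixed multi-class classifier or become a ranking model that scores a question against relation names, so that unseen relations can be handled at inference time. The toy sketch below only illustrates that distinction; it is not the reviewed paper's implementation, and the hashed bag-of-words encoder and example relation names are invented for illustration.

```python
# Illustrative contrast between classification-based and ranking-based
# relation prediction; not the reviewed paper's actual code.
import zlib
import numpy as np

def encode(text, dim=64):
    # Stand-in encoder: hashed bag-of-words; a real system would use a trained encoder.
    vec = np.zeros(dim)
    for tok in text.lower().split():
        vec[zlib.crc32(tok.encode()) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

# Classification view: the output space is fixed to the training relations,
# so a relation unseen at training time simply has no logit.
train_relations = ["film.film.director", "music.album.artist"]

# Ranking view: score the question against any relation's textual name,
# so new relations can be ranked at inference time without retraining the head.
def rank_relations(question, candidate_relations):
    q = encode(question)
    scored = [(r, float(q @ encode(r.replace(".", " ")))) for r in candidate_relations]
    return sorted(scored, key=lambda x: x[1], reverse=True)

print(rank_relations("who directed the movie inception",
                     train_relations + ["book.book.author"]))
```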
(token-ID sequence omitted)
(attention mask omitted: a list of all 1s matching the token sequence length)
(token-ID sequence omitted)
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the introduction of highquality image generation models particularly the stylegan family provides a powerful tool to synthesize and manipulate images however existing models are built upon highquality hq data as desired outputs making them unfit for inthewild lowquality lq images which are common inputs for manipulation in this work the authors bridge this gap by proposing a novel gan structure that allows for generating images with controllable quality the network can synthesize various image degradations and restore the sharp image via a quality control code the proposed qcstylegan can directly edit lq images without altering their quality by applying gan inversion and manipulation techniques the proposed qcstylegan is novel and can generate both clean and degraded images with a qualitycontrol input the qcstylegan allows more accurate gan inversion on lowquality images making image manipulation applicable on these inputs it provides an efficient ganbased image restoration solution qcstylegan also supports novel image degradation synthesis tasks the reviewer thinks that the proposed method is novel and recommends accepting the paper the reviewer is not very familiar with the scope of this paper the reviewer has no idea of the limitations docsepthis paper targeted a very interesting problem when we use a ganinversion method to edit images the gan can only generate sharp images and what if we got a degraded image and we dont want its degradation to be distorted this paper gives a method to solve this problem in addition to an original stylegan network this paper proposes a degradblock to add degradation that is controlled by an additional degradation quality code when a new image comes the method not only inverts the sharp image but also inverts its degradation the results show good performance and several applications are studied strengths 1 the question of what if we got a degraded image is interesting 2 it is the first gan method that can be controlled to generate degraded images 3 nice performance 4 a generator that knows different degradations is a good idea for blind image restoration weaknesses 1 although interesting the practical value of the studied problem is not that obvious given the mentioned setting why would people want a degraded image maybe the restoration task is more valuable 2 the design of the degradblock lacks novelty it is like an addin controlled by the quality code and with some conv layers the direct product operation makes it give 0 when the quality code is 0 this method doesnt give us much insight technically although i comment on the novelty it is the last reason that affects my rating 3 the proposed method can be used as a blind face restoration method but the provided experiments are not enough to show the actual potential this work is related to face images but i guess there is no obvious potential negative societal impact the limitation is not discussed docsepstylegan is able to generate highresolution images and produce a semantically disentangled and smooth latent space at the same time this paper proposed an extension of stylegan that is able to control the quality of the generated images by giving a controllable vector this extension not only allows generating degraded images but also brings some interesting applications such as image editing and image restoration by applying gan inversion and manipulation techniques although there are not many quantitative comparisons qualitative results show impressive effectiveness strengths the idea is easy to understand and i think maybe the concept can be extended to different research the qualitative results show impressive effectiveness weaknesses there is no baseline method of degraded image generation models for comparison although the qualitative results are good it would be more convincing if there were more quantitative comparisons as mentioned in my second question the proposed method may not perform well in some cases it is important to investigate the reason and it should be discussed in the paper it will be beneficial to the readers and may help readers to conduct further research in the future docsepthis paper aims to solve the problem that existing gan methods are unfit for lowquality input images the authors present a quality controllable image generation and manipulation method the proposed method can be applied to many applications like image restoration and image manipulation strengths the proposed method is applied to many applications like image restoration and image editing weaknesses the writing of this paper needs to be improved the paper is hard to follow for example there is no overview of the proposed method although there is an overview subsection it is more like a problem formulation an overview in plain english and an illustration or framework are necessary before giving any formula the motivation novelty and contributions are not well described the method is not wellvalidated and no extensive comparison results are given the limitations are discussed ### Summary:
qcstylegan provides a controllable way of generating images with certain types of corruption generating corrupted images has many applications because they are common in realworld situations so having the ability to generate samples from distributions with controllable corruptions is a powerful idea the reviewers generally felt the paper should be accepted although the highest score came from the least confident reviewer the overall average is still on the accept side and i feel the paper has enough interesting ideas and potential applications to be useful to the neurips audience so i recommend acceptance
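The degradation block criticized in the second review above is essentially a residual branch whose contribution is gated by the quality code, so a zero code returns the clean features. Below is a minimal, hypothetical PyTorch sketch of that gating idea; the layer choices, the per-channel gate, and the class name are invented for illustration and are not the actual QC-StyleGAN DegradBlock.

```python
import torch
import torch.nn as nn

class QualityGatedDegradBlock(nn.Module):
    """Hypothetical quality-gated degradation block (illustration only).

    The degradation residual is multiplied by a gate derived from the quality
    code, so a zero code leaves the incoming sharp features untouched,
    which is the behaviour the review describes.
    """

    def __init__(self, channels: int, code_dim: int):
        super().__init__()
        self.degrade = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.LeakyReLU(0.2),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        # bias=False so that a zero quality code yields a zero gate.
        self.gate = nn.Linear(code_dim, channels, bias=False)

    def forward(self, feats: torch.Tensor, quality_code: torch.Tensor) -> torch.Tensor:
        gate = self.gate(quality_code).unsqueeze(-1).unsqueeze(-1)  # (N, C, 1, 1)
        return feats + gate * self.degrade(feats)


block = QualityGatedDegradBlock(channels=8, code_dim=4)
feats = torch.randn(2, 8, 16, 16)
clean_code = torch.zeros(2, 4)
assert torch.allclose(block(feats, clean_code), feats)  # zero code -> identity
```

The assertion makes the reviewer's point explicit: because the residual is multiplied by the code-derived gate, setting the quality code to zero reduces the block to the identity on the sharp features.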
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper notes that universal successor feature approximators usfa $\psi(s, a, z)$ (here z is the policy vector) exploit the smoothness of optimal value functions across different tasks but their approximation error could be large when target task vectors are far away to improve generalization to a new task w which can be expressed as a linear combination of source tasks the paper proposes constrained training of successor features that reduces its approximation error on the new task it then shows that a similar effect can be achieved by using similar constraints at test time without changing the training of the successor features the proposed method is able to improve task generalization on scavenger and reacher domains strengths 1 the resulting constrained gpi has good theoretical motivations 2 it exhibits better task generalization than gpi baselines weaknesses 1 the baselines used for comparisons are inadequate for example recent works on bilinear decomposition of q functions bilinear value networks hong et al 2022 https://arxiv.org/pdf/2204.13695.pdf showed improved generalization on novel tasks specifically for a goal conditioned task with goal g they choose the parameterization $q(s,a,g) = \phi(s,a)^\top w(s,g)$ as opposed to $q(s,a,g) = \phi(s,a,g)^\top w$ or $q(s,a,g) = \phi(s,a)^\top w(g)$ it would be nice to see comparisons to this bilinear decomposition 2 furthermore the domains ie scavenger and reacher on which the method is tested are limited i would appreciate it if the authors could provide a more extensive evaluation on various goal conditioned tasks such as the ones in fetchgym https://github.com/jmichaux/gym-fetch the authors discuss the limitations of their work in section 5 docsepthe paper looks at the problem of transfer in reinforcement learning from source tasks to target tasks when the reward signal changes across tasks but the state and action spaces remain the same the authors address a limitation of a current approach in this setting gpi with universal successor features approximators usfas the work argues that usfas can make high approximation errors on test targets if their solutions are distant from the source targets ones and proposes a solution to mitigate this limitation the solution involves constraining the action-value approximation error on the new target task using lower and upper bounds introduced in the paper the experiments show a zeroshot transfer improvement in performance compared to baselines usfa gpi on synthetic scavenger tasks and robotic locomotion tasks strengths the paper addresses a relevant and important open problem in rl transfer learning the writing is clear rigorous and the paper is easy to follow a pleasant read i found the theoretical contribution to be the main significant result extending the nemecek and parr result to reason about the value of an optimal policy for a new task beyond the conical hull is an important result the proposed method constrained gpi is an elegant way of improving gpi weaknesses the limitations of the work are not really discussed and no direction for future work is given this makes it difficult to gauge if the method and its guarantees would extend to more complex and large scale environments very limited discussions on the limitations in fact i did not really understand the remark on not getting the full effectiveness no negative societal impacts were reported and i dont see any myself either docsepsuccessor features generalized policy improvement provide a method for generalizing to new tasks that are a linear combination of existing tasks usfas combine this approach to generalization with neural networks which also generalize by training function approximators conditioned on the task this work extends usfas by showing that one can establish lower and upper bounds on the generalization performance when estimating actionvalues on a new task these bounds are used to constrain the function approximator estimate they test this approach comparing to usfas on two simple environments and find using these constraints outperforms usfas strengths the paper is well communicated the paper will be of interest to the community working on successor features and related approaches the paper introduces both theoretical grounding and empirical results weaknesses in common with a lot of work in sf the test environments are toy and its unclear whether these approaches scale to more interesting tasks for this reason it may be of interest only to a smaller subset of the neurips community it would be interesting to compare very different approaches such as a conditional decision transformer 1 or gato style policy distillation 2 3 introduces an alternative method for improving the performance of gpi in the maxent rl framework that reduces the generalization error and should be cited and ideally compared against 1 chen lili et al decision transformer reinforcement learning via sequence modeling advances in neural information processing systems 34 2021 pp 15084-15097 2 reed scott et al a generalist agent arxiv preprint arXiv:2205.06175 2022 3 hunt jonathan et al composing entropic policies using divergence correction international conference on machine learning pmlr 2019 yes although some comments and ideally testing on scaling to more complex domains would be useful ### Summary:
the reviewers agree that the paper is a valid contribution to the line of research on successor features sfs and generalized policy improvement gpi they also agree that the paper is well written and easy to follow three points that may be worth taking into account when preparing the final version of the paper all related to the presentation 1 the paper has two main contributions bounds that improve upon those of nemecek and parr 1 and a new application of bounds of this type to detect approximation errors in universal successor features approximators usfas although these two contributions are listed at the end of section 1 in the rest of the paper the derived bounds are always mentioned in the context of their specific use with usfas in particular i believe the authors never mention that their bounds could also be applied to decide when to add new policies to a set of policies to be used with gpi as suggested by nemecek and parr in summary the authors may want to have a presentation that clearly disentangles the two contributions of the paper 2 although the writing is mostly clear i feel like the core idea of the paper is never spelled out in a concise way this is especially important for the abstract given a set of vectors $\mathbf{w}_1, \mathbf{w}_2, \dots, \mathbf{w}_n$ and an associated space of tasks $\mathcal{W}$ composed of all possible linear combinations of the vectors $\mathbf{w}_i$ the paper derives lower and upper bounds for the value functions of all tasks in $\mathcal{W}$ in terms of the value functions of the tasks $\mathbf{w}_i$ these bounds can then be used in several ways one novel application proposed in the paper is to detect approximation errors in usfas maybe spelling this out in the abstract and introduction would help the reader to quickly understand the message of the paper 3 it seems like the max in the upper bound eq 10 will always be resolved based on the sign of $\alpha_{\mathbf{w}}$ since the first term will always be larger when $\alpha_{\mathbf{w}} \geq 0$ and the second term will always be larger when $\alpha_{\mathbf{w}} < 0$ this seems to be the trick used to improve upon nemecek and parrs bound the authors should consider adding this comment to the paper i hope the constructive feedback is useful to improve your paper 1 nemecek m parr r policy caches with successor features icml 2021
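As a rough illustration of the bound-based idea described in the reviews and meta-review above, here is a small numpy sketch of generalized policy improvement with successor features, where a USFA-style action-value estimate for a new task is clamped against a crude lower bound obtained from the source policies. It assumes rewards linear in the features; the clamping rule and the placeholder upper bound are simplifications for illustration, not the tighter task-dependent bounds derived in the paper.

```python
import numpy as np

def constrained_gpi_estimate(psi_sources, w, q_usfa):
    """Clamp a USFA-style action-value estimate for a new task (sketch only).

    psi_sources: (n_policies, n_actions, d) successor features of the source
                 policies at one state, assuming rewards linear in features.
    w:           (d,) reward weights of the target task.
    q_usfa:      (n_actions,) raw action values predicted for task w.

    Q^{pi_i}(s, a; w) = psi^{pi_i}(s, a) . w is a valid lower bound on the
    optimal Q*(s, a; w), and plain GPI acts greedily on the max over i.
    The upper bound below is only a placeholder.
    """
    q_sources = psi_sources @ w              # (n_policies, n_actions)
    lower = q_sources.max(axis=0)            # GPI lower bound per action
    upper = np.full_like(lower, np.inf)      # placeholder upper bound
    return np.clip(q_usfa, lower, upper)

rng = np.random.default_rng(0)
psi = rng.normal(size=(3, 4, 5))             # 3 source policies, 4 actions, d=5
w_new = rng.normal(size=5)                   # unseen task vector
q_hat = rng.normal(size=4)                   # raw USFA prediction
print(constrained_gpi_estimate(psi, w_new, q_hat))
```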
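The bilinear decomposition q(s, a, g) = phi(s, a)^T w(s, g) raised in the first review of this paper can be sketched as follows; the encoder widths and embedding size are assumptions for illustration, and the actual bilinear value network of Hong et al. uses its own architecture and training losses.

```python
import torch
import torch.nn as nn

class BilinearQ(nn.Module):
    """Sketch of the bilinear parameterization q(s, a, g) = phi(s, a)^T w(s, g).

    Encoder widths and the embedding size k are invented for illustration.
    """

    def __init__(self, s_dim: int, a_dim: int, g_dim: int, k: int = 64):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(s_dim + a_dim, 128), nn.ReLU(), nn.Linear(128, k))
        self.w = nn.Sequential(nn.Linear(s_dim + g_dim, 128), nn.ReLU(), nn.Linear(128, k))

    def forward(self, s: torch.Tensor, a: torch.Tensor, g: torch.Tensor) -> torch.Tensor:
        phi_sa = self.phi(torch.cat([s, a], dim=-1))
        w_sg = self.w(torch.cat([s, g], dim=-1))
        return (phi_sa * w_sg).sum(dim=-1)   # dot product over the k-dim embedding


q_net = BilinearQ(s_dim=10, a_dim=4, g_dim=3)
q_values = q_net(torch.randn(2, 10), torch.randn(2, 4), torch.randn(2, 3))
print(q_values.shape)  # torch.Size([2])
```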
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this submission studies how the choice of activation function impacts the reproducibility of experiments involving deep networks it proposes a new activation function with the goal of designing a smoothed relu and provide experiments comparing it against other activations in terms of irreproducibility measured via pd and performance the problem of understanding how model design choices can have negative impacts on experimental reproducibility is interesting and timely but i believe the paper does not provide a strong enough case for their approach and contributions first the adopted metric to measure irreproducibility prediction difference pd is never evaluated in terms of how sensible of a metric it is to capture reproducibility this also seems to be lacking in 1 actually one can argue that it is not a sensible metric at all except for its hamming form as it is not invariant to how the models are calibrated as discussed below for example take any binary classifier and consider two copies of it with different calibrations ie scaling the output layer weights by positive scalars one for each model even though the models always agree on their predicted labels regardless of their calibrations the pd can be made arbitrarily close to 05 by calibrating the models appropriately even more worrying is that the same can be done by taking a binary classifier and a copy of it with flipped predictions the pd between the two can be made arbitrarily close to 0 by scaling their weights down note that this problem also happens with the relative pd to see how this is connected to the choice of activation functions especially relu x smelu note that for normallydistributed inputs centered around the origin gradients variance of relu is 14 while for smelu it is approximately sigma2 4 beta2 for large enough beta sigma2 being the variance of the input distribution this discrepancy can have a nontrivial impact on the models calibration and cause differences in pd to be artifacts since this doesnt happen with the hamming form of pd i believe figure 15 in the appendix to be the most informative one however it seems that different activations result in less than 1 prediction discrepancy across models which is fairly insignificant and hence it is hard to argue that activations actually matter for reproducibility at least from the presented experiments lastly it is hard to draw any conclusions from the presented experiments the ctr results are based on a private dataset while the mnist ones are extremely smallscale with both the dataset and the model being arguably toy problems there are numerous tasks where reproducibility is a prominent issue eg deep reinforcement learning generative modelling especially gans making training a 2layer network on mnist a poor choice to evaluate reproducibility problems as an additional note the authors seem to rely heavily on the work of shamir coviello 20 1 which introduced the pd metric even though the paper was only made publicly available on arxiv a week after the reviewing period for this submission started when citing papers which are yet to be made available it would be helpful to introduce and discuss the relevant content in a selfcontained way while the authors avoided much of my confusion by presenting the full definition of the pd metric the referred paper has useful information which was not discussed such as which summand is normalized in its relative form and 
how the different variants compare since i have major concerns with the paper particularly on the reliability of pd as a metric and the unconvincing empirical results i am voting for rejection 1 shamir coviello antidistillation improving reproducibility of deep networks update after rebuttal it appears that the comment made by the reviewer may stem from an assumption that two models which are compared for pd can be different in the operations they perform to generate the predictions this is incorrect my review does not mention such assumption and my statements hold without it as stated in my review i consider models with different weight magnitudes making no assumptions on the underlying cause pd as we defined in section 2 is aimed explicitly at measuring differences between predictions of a set of models that are supposed to be identical in all their components indeed and my point is that comparing the pd of two sets of models that are not identical is also problematic even if all models within each set are identical except for the pd in its hamming form more details below changing calibration between such models violates this assumption please check the celebrated work of guo et al on calibration of modern neural networks calibration does not necessarily consist of an explicit additional component that modifies the model and the same model trained in different ways can present distinct calibrations more specifically two sets of models can have not only the same accuracy but the exact same predictions ie there is a 11 mapping from each model in one set to a model in the other set that has the exact same predictions for all data points but vastly different internal calibrations which will result in vastly different pds to be overly specific the scalar pd of a set will be different from the scalar pd of the other set even though the two sets agree pointwise in terms of predictions if one changes something about one of the models including how calibration is done one would expect them to predict differently and have different accuracies this is incorrect first im not assuming models are explicitly calibrated only that they have distinct internal calibrations confidences in terms of predicted probabilities which depend mostly on the parameters magnitudes second scaling the output layer weights by positive scalars quoting from my review will not change a models accuracy while it changes the classwise predicted probabilities the rank of the logits is preserved if the authors remain skeptical of this fact let phix denote the activations of the previous to last layer of a model and let langle wi phix rangle langle wj phix rangle where wi and wj are the weight vectors of output units respective to classes i and j ie pyi x pyj x for probabilities produced by a softmax over logits then for any alpha in mathbb r we have trivially that langle alpha wi phix rangle langle alpha wj phix rangle hence pyi x pyj x for probabilities p computed from the new logits again note that it is not necessary for an external explicit calibration factor alpha to be employed training the network differently or even adopting a different activation function just consider max0 10x for clarity which will scale phix by a positive factor and yield the same observation as above specifically if one flips the predictions of a binary classifier the flipped model will have much worse accuracy from the actual model of interest and measuring pd at this point is irrelevant the fact that two classifiers with vastly different accuracies 
can have zero pd is worrying and shows that pd is not a trustworthy metric claiming that such evaluation is irrelevant and should not be done does not address the issue since the authors remained unconvinced that the pd is sensible to positive scalings of a models parameters and hence comparing the pds of two sets of models with different activations one activation per set is not sensible here is a more detailed explanation of this fact assume a fairly trivial example for clarity two 1d data points x1 1 x2 1 and binary classification models f1 f2 where f1x sigmaw1 cdot phix and f2x sigmaw2 cdot phix are the assigned probabilities for the positive label and phi mathbb r to mathbb r captures some notion of activation function andor scale of weights before the final classification layer for simplicity let phix alpha x for some alpha in mathbb r and feel free to think of alpha as a magnitude of an activation function instead of some notion of internal calibration then we have p11 sigmaalpha w1 sigmaalpha w1 p12 sigmaalpha w2 sigmaalpha w2 p21 sigmaalpha w1 sigmaalpha w1 and p22 sigmaalpha w2 sigmaalpha w2 the pd of the set consisting of the two defined models after simplifying the 8 relevant terms ends up being simply delta1 sigmaalpha w1 sigmaalpha w2 lets pick some numbers to make this crystal clear let alpha 1 w1 10 w2 01 so we get delta1 sigma1 sigma01 approx 02 note that wlog we can assume that y1 1 y2 1 so that for these weights both models achieve 100 accuracy now take another set consisting of models g1 g2 defined similarly to f1 f2 but with g1x sigmaw1 cdot phix g2x sigmaw2 cdot phix where phi not the derivative of phi captures the the activation function andor weight magnitude of layers preceding the classification head let phix beta x for simplicity consider the case where beta 01 w1 10 w2 01 ie the weights of g1 g2 are exactly the same as the weights of f1 f2 but phi is a scaleddown phi eg a different activation function in this case note that both g1 and g2 achieve 100 accuracy as well for this new set of models consisting of the pair g1 g2 we get delta1 sigma01 sigma001 approx 002 a value around 10 times smaller than the pd of the first set of models even though the second set predicts the exact same labels for each data point and claiming that the set g1 g2 is more robust than the set f1 f2 in terms of reproducibility is simply factually wrong if the idea of having beta neq alpha sounds a bit of a stretch since the proposed activations are not simply scaled down relus consider instead the case beta 10 w1 01 w2 001 and note that we again get delta1 approx 002 for this second set of models the discrepancy in terms of magnitude of weights can be caused by different optimizers different strength of ell2 regularization or as my original review already mentioned smaller variance of gradients wrt activation function to reiterate in the above example we did not at any point compute the pd of a set of models that had different components both f1 f2 the first set had the same activation function phi while g1 g2 had phi going a step further which shows how problematic the pd is as a metric consider an arbitrary set of binary classifiers s1 f1 f2 dots fm where fix sigma langle wi phix rangle is the probability assigned by the ith model of x belonging to the positive class now take another set of binary classifiers s2 g1 g2 dots gm with gix sigmalangle wi phix rangle where wi is the same weight vector that model fi has ie except for phi the set s2 is pointwise identical to the set s1 finally 
going a step further, which shows how problematic the pd is as a metric, consider an arbitrary set of binary classifiers $S_1 = \{f_1, f_2, \dots, f_m\}$, where $f_i(x) = \sigma(\langle w_i, \phi(x)\rangle)$ is the probability assigned by the $i$-th model of $x$ belonging to the positive class. now take another set of binary classifiers $S_2 = \{g_1, g_2, \dots, g_m\}$ with $g_i(x) = \sigma(\langle w_i, \phi'(x)\rangle)$, where $w_i$ is the same weight vector that model $f_i$ has, i.e. except for $\phi'$ the set $S_2$ is pointwise identical to the set $S_1$. finally, let $\phi'(x) = \beta\,\phi(x)$, where $\beta \in \mathbb{R}$, and feel free to check that for any $\beta$ every model $g_i$ from $S_2$ will agree with the model $f_i$ from $S_1$ in terms of predicted class, i.e. although the class probabilities will change, the ranking is preserved for any $\beta$. this means that $S_2$ produces the exact same predictions as $S_1$ for any possible data point. taking $\beta \to 0$ yields $g_i(x) \to 0.5$ for any $i \in [m]$ and any possible $x$, hence the pd of $S_2$ will go to zero, even though the pd of $S_1$ can be arbitrarily large and the two model sets $S_1, S_2$ agree pointwise in terms of predicted classes. in other words, taking an arbitrary set of models with relu activations, copying its weights, and replacing the relu by $\phi'(x) = \max(0, x/10^{10})$ will yield a second model set with pd close to zero. hopefully the authors agree with me that this trivial replacement of activation functions does not solve any reproducibility problem in machine learning.

with the above in mind, i urge the authors to reevaluate pd as a metric. as mentioned in my review, the hamming form does not suffer from this issue, but the reported numbers in this case seem to indicate that there is little to no reproducibility challenge for the adopted tasks.

docsep
the paper claims that smooth activations are more reproducible than relu. the accuracy gain claims seem marginal and not carefully carried out; further ablation studies are needed to strengthen the conclusion on accuracy. however, the main point of the paper is reproducibility, where the feature is measured by the prediction difference (pd) introduced in section 2: pd is a measure over a set of models, where the pd score is low if the models output consistent estimates for the same validation samples.

do models with the same initialization and the same randomness seed for the shuffles of sgd have high pd? if so, then there would be numerical issues in the way models are trained; otherwise, where does the difference in pd come from? why are such fluctuations considered irreproducible? why would pd be identified with reproducibility?

overall, the paper has a consistent story to tell; it is a fresh perspective and focuses on the main problem at hand. however, i fail to understand some of the key measurements and their connection with reproducibility. it is undeniable that the shape of the landscape and the consistency of the models are linked. however, the variation found by relus could help increase ensemble accuracy; in other words, such variation can be a feature depending on the context. and in principle, given the proper seeds and versions of the software, the results should be fully reproducible. am i missing something? i am looking forward to the responses from the authors on this point.

docsep
summary: this paper addresses the problem that deep neural networks (dnns) can lead to different predictions even when they are initialized the same way, due to the stochasticity of samples selected in minibatch sgd and update procedures from different optimizers, which leads to convergence to different regions along the loss surface. they attribute this problem to the complicated loss surface that arises from the discontinuity in relu activations. they show that smooth activations can help remedy this issue by tuning the activation to become more relu-like, which leads to a better tradeoff between prediction differences (i.e. consistency) and model accuracy.

the pitch of the problem is motivated by fields like healthcare, where more consistent predictions are important. however, it is unclear why one would expect reproducible predictions when two networks with the same initialization
are trained with a different sampling of minibatch shuffles: this stochasticity should lead to different solutions for complex surfaces, just as a different initialization would. smoothing the loss surface can help here, and this paper only explores the extent to which the smoothness of the activation plays a role. it is unclear to me whether the activation function alone is sufficient for solving this problem; other options, such as regularization, are not explored.

comments:
- it is nice that traditional activations like swish and softplus are parameterized with a beta so that they can be modulated to become more relu-like.
- given the stochasticity of minibatching, it is not surprising that you can start from the same initialization and navigate to a different region of the loss surface. since the stochasticity is due to the size of the minibatch, this paper could benefit from exploring how the effect size of pds changes with batch size.
- overall, the empirical evidence is very light: they compare different activation functions for a private dataset for ad click-through rate and for mnist. to make general claims there should be various networks and various optimizers on various datasets. the paper can be more convincing if more datasets are explored, preferably not a private dataset for which no one can validate their results.
- also, the optimizer plays a major role in navigating the loss surface, but only a single optimizer is compared in the first task while only 2 are explored in task 2 (i.e. adagrad and sgd). adding more optimizers, especially popular ones like adam, could help clarify whether the smooth activations are robust across optimizers or whether this is a special case for adagrad. i suspect the former, but experiments must be performed to prove any point.
- on a smaller note, what does each dot in fig 3 represent? assuming it represents different beta values, there should be more consistency: softplus has many points while tanhexp only has 4 points. also, how do these betas alter the activation functions? the main text/appendix show plots for 5 values of beta. also, in fig 4, why aren't the same betas used for adagrad and sgd? the choices of beta for a given base activation function should be consistent throughout.
- fig 4 should also show the standard deviation across the 12 models; this can provide a sense for the statistical significance of the results.
- rescu, a generalized activation function, is introduced but never used in any comparisons. what is the benefit of such a formulation?
- this paper mentions that smelu is less expensive compared to the other smooth activations. any direct statement like this should be followed up with quantitative comparisons, in this case the time per epoch.
- this paper shows how weight normalization also influences the loss surface, but this was not explored empirically. i think adding additional experiments that scan different weight norms for a fixed beta would only strengthen their claims.
- it is unclear whether their initialization strategy for the dnns explored here is fixed or whether they simply sampled different weights from the same initialization distribution. the wording of the motivation was that pd happens for the same initialization, but the language in the text was ambiguous as to whether they enforced the same random number seed for the initializer across each experiment.
- as a control experiment, the effect size of prediction differences between training a model with different initializations should be compared with their main results that explore training with different minibatch order starting from the same initialization.
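the beta-parameterized smooth activations mentioned in the comments above (swish, softplus, smelu) share the property that they approach relu as beta grows. a generic illustration with the standard beta-softplus is sketched below; this is an assumption-level example and not necessarily the exact parameterization used in the paper under review.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def softplus_beta(x, beta):
    # Standard beta-softplus: log(1 + exp(beta * x)) / beta, computed stably.
    # Smooth for every finite beta; converges to relu as beta -> infinity.
    return np.logaddexp(0.0, beta * x) / beta

x = np.linspace(-3.0, 3.0, 7)
for beta in (1.0, 5.0, 25.0):
    gap = np.max(np.abs(softplus_beta(x, beta) - relu(x)))
    print(f"beta={beta:5.1f}  max gap to relu = {gap:.4f}")  # shrinks like log(2)/beta
```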
docsep
thanks for the efforts of the authors. some of my concerns have been addressed; however, i still think that the experiments are not convincing enough for iclr, so i keep my score.

summary: the paper argues that a smooth activation function may produce a smooth surface of the output of a network, which identifies a good reproducibility behavior. based on this observation, the paper proposes the smelu activation function and its generalized version. experiments with fully-connected neural networks are presented.

strengths:
- the idea that smooth activation functions may have better reproducibility of neural networks is very interesting. it provides a different way to understand the role of activation functions in neural networks.

weaknesses:
- figure 2 shows the motivation of the paper. it uses the surface of the output of a neural network wrt its input to show the number of local minima; however, we usually use the loss landscape to show the local minima of a neural network. moreover, the paper needs to test with various layers (e.g. shallow and deep), initializations (e.g. gaussian or uniform), and architectures. it seems that figure 2 only works with networks having two-dimensional input and one-dimensional output; for other networks, the visualization of the loss landscape of neural networks may be a good reference.
- for smelu, the parameter beta is very important: it balances the accuracy and the reproducibility. however, the paper does not give a practical method to choose its value. the paper claims in appendix b that beta can be learned with the weights of a neural network; however, the commonly used objective functions correspond to the accuracy, and how to update beta to balance the accuracy and the reproducibility during training is unknown.
- in figure 4, different optimizers could produce different results; however, the paper only uses adagrad. it is better to test with sgd, adagrad, adam, and amsgrad.
- the datasets used in the paper are very small. it is better to test with larger datasets.
- the paper only tests with fully-connected neural networks. it is better to test with convolutional neural networks.

overall, since the paper is not a theoretical one, it needs extensive experiments to verify the claims; however, the experiments in the paper are not convincing enough.
### Summary:
all reviews were negative for this paper due to various issues. i think the main issue was that the experimental results were too weak to be convincing: for example, the reviewers were not sure if the differences in performance between different activations are significant. the reviewers also required more datasets and more experiments. the authors added std to the results and more experiments, and argued that the current datasets are sufficient, but the reviewers seemed to remain unconvinced.
[token-id arrays omitted: the input_ids, attention_mask, and labels columns for the row above]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review:
quality: the proposed method is significantly more practical, both in terms of ease of implementation and speed, than prior related work for minimizing combinatorial losses. the paper's exposition needs to be improved considerably though; see below.
originality: the paper draws on advanced concepts from combinatorial optimization that may be unfamiliar to many iclr readers but that have the potential for large impact.
significance: the paper's proposed method is practical for very common ml setups in nlp and computer vision that are used day-to-day.

pros:
- proposed method is interesting, practical, and relatively easy to implement.

cons:
- paper writing omits many key details necessary to use the method in practice.
- experiments build the method on top of outdated models and do not demonstrate that the method could be used with modern models, e.g. attention-based decoders.

comments: i appreciate that you provide a very general recipe for constructing the differentiable combinatorial layers. however, the paper provides far too few details for the particular problems (bipartite matching and sequence alignment) that appear in the experiments, and the supplementary material does not help. your method is promising, and practitioners that do not have a background in combinatorial optimization may want to use it; the paper does not provide enough details to do so. i'd replace algorithm 1 with a box specific to bipartite matching. you need to provide far more detail/background on differentiable decoding for the second set of experiments: it was unclear what softmax / gumbel-softmax meant; is softmax not the same as mle? while the second set of experiments provides useful ablation analysis of the impact of your method, it builds on an outdated model. you write "while this architecture is no longer the top performer in terms of rouge metric (currently large pretrained self-attention models are the state-of-the-art), it is much more efficient in training, allowing for experimenting with different loss functions". is the speed difference really that much? i'm surprised that that makes a difference in terms of which experiments are feasible. figure 1: why is the cvxpy timing u-shaped?

docsep
the authors present a technique to integrate combinatorial optimization subproblems into a gradient-descent-based application. the approach they describe relies only on differentiation of the value of the combinatorial program instead of the solution vector, and can be done with relatively low overhead compared to techniques that involve modifying combinatorial algorithms into differentiable elements or the use of differentiable linear/quadratic programming layers. they motivate and show the advantages of their approach using two natural and useful examples. the experimental results show promise, and the paper is well written and motivated.

docsep
this paper shows how to differentiate through combinatorial losses by differentiating through the ideal formulation lp. understanding how to differentiate through combinatorial optimization so that it can be used as part of the model or loss is important, as it captures many natural operations. i am giving this a weak accept as it is a novel approach for differentiation that the community can build on, but the positioning and relation to prior work and empirical comparisons could be stronger; more details below.

strengths: to the best of my knowledge, this is a novel approach that makes the elegant and natural connections going from a combinatorial problem to
an ilp, to an lp, to differentiating through the lp using known methods.

weaknesses: the biggest weakness is the lack of a comparison with related approaches for differentiating through combinatorial losses, such as some of the approaches discussed in the introduction (e.g. pogancic 2020) that consider similar problems. the experimental settings considered in this paper compare to baselines that don't differentiate through the combinatorial aspect of the problem. while this is a great step of validating the power of these approaches, i think that it would be significantly more convincing to empirically compare to approaches that differentiate through the combinatorial losses.

i also think it's important to discuss the comparisons to the related approaches for differentiating through parameterized combinatorial optimization. are the approaches using the same definition of a derivative? pogancic 2020 discusses an issue with the real derivative through combinatorial optimization being uninformative or near-zero everywhere; is this also an issue in the setting here? can this approach be seen as an approximation or surrogate to the derivative of the combinatorial problem, as the other approaches? if i understand correctly, this approach requires a known mapping from the combinatorial problem to the ilp and from the ilp to the lp, which could make it more involved to apply than some of the related methods that don't require knowing this information.

other questions and comments:
- how should the gradients of the continuous baselines with cvxpy compare to the method being proposed in the experiments? if they're using the ideal formulation lp, should they be the same in theory (as figure 1 validates), but in practice, due to solver errors, give suboptimal directions (as table 1 shows)?
- the last paragraph of the introduction presents a form of the criterion with a loss and the combinatorial objective value with notation that is not used later on in the paper.
- page 2, second paragraph: the last sentence on differentiable continuous lps/qps seems separate from the rest of the paragraph on combinatorial solvers.

docsep
summary: the authors propose a simple method to optimize objective values defined as the optimal value of a combinatorial integer linear program whose parameter depends on the output of a certain model. for this, they note that generalized gradients of such objective values are efficiently computed using the primal and dual solutions of the ilp itself; in particular, the ilp can be solved using a specialized and efficient solver instead of solving a generic lp as proposed in concurrent work. the authors propose two example applications that are described precisely and validated against generic lp solving approaches. using specialized combinatorial solvers outperforms generic lp solving approaches in terms of computation and in terms of validation metrics, as generic lp solving is hindered by errors which make the learning process diverge.

review: the paper is well written and well organized. the theoretical aspects are well documented, and the examples are introduced precisely and pedagogically. the method itself is interesting, as it ensures that the generalized gradients of many ilp problems are computable efficiently; theorem 1 states those guarantees, and many examples are discussed.

on the other hand, the novelty of the method may be a little overstated. in particular, it is known that using a generic lp solver is oftentimes not the most efficient way of computing the gradient and that a specialized combinatorial solver should be used. for the problem of gsa (which corresponds to using a dynamic time warping loss), solving the small lp is done using dynamic programming.
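for context, the dynamic program the reviewer alludes to is the standard dtw recursion; a minimal textbook sketch is below. it is illustrative only, not the authors' implementation, and the cost matrix is a made-up example.

```python
import numpy as np

def dtw_value(cost):
    # Value of the optimal dtw alignment for a pairwise cost matrix,
    # computed with the classic O(n*m) recursion instead of a generic lp solver.
    n, m = cost.shape
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = cost[i - 1, j - 1] + min(acc[i - 1, j],
                                                 acc[i, j - 1],
                                                 acc[i - 1, j - 1])
    return acc[n, m]

a = np.array([0.0, 1.0, 2.0])
b = np.array([0.0, 0.9, 1.1, 2.0])
print(dtw_value(np.abs(a[:, None] - b[None, :])))  # ~0.2 for this toy instance
```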
using a dtw loss on top of a deep neural network has already been studied; see e.g. the cited mensch and blondel paper, where the authors solve the lp using dp. using a generic lp solver such as the one in cvxpy is a little naive in that case, and it is not surprising that it performs poorly. in this example we only require the gradient with respect to the cost p is primalpdfiffefficient. arguably, this manuscript also enables us to backpropagate through parametrized constraints using the formula proposed in theorem 1, which is little known in this community.

docsep
the value of the optimal objective as a function of the cost vector $c$ can be written as $z(c) = c^\top u^*(c)$, where the optimal solution $u^*$ also depends on $c$. the function $u^*(c)$ is piecewise constant (there are finitely, resp. countably, many feasible solutions, i.e. candidates for $u^*$), and so the function $z(c)$ is a piecewise linear function of $c$ with gradient $u^*(c)$ wherever it exists; otherwise there is an analogous subgradient. obviously, all it takes for computing $u^*(c)$ is solving, anyhow, the combinatorial problem. this is all trivial and well known, yet the authors do precisely that.

can it be saved by proposing gradients also wrt constraints? no: these results are slightly less trivial but, as the authors admit, have been known since 1975. moreover, the gradient with respect to $c$ is the only one used in experiments, as far as i understand.

is there independent value in theorem 1? i do not see it. it seems to be a bulky wrapper around the classical result; it only introduces some sort of transition from a vector specifying a combinatorial problem to a collection of vectors/matrices specifying an integer program. also, the central concept of generalized gradient merely provides a formal framework to talk about non-unique gradients at boundary regions, similarly to the subgradient / subdifferential; for the method itself it has no specific relevance.

the claims of better performance compared to cvxpy are also absolutely non-surprising: cvxpy currently uses a slightly suboptimal and very expensive solver for linear programs. that is all. ### Summary:
this paper received high variance in the reviews. i personally agree with anonreviewer4 that the theoretical results presented in this paper are well-known results on the sensitivity analysis of linear programs; see for instance "introduction to linear optimization" by bertsimas and tsitsiklis, chapter 5. more generally, these results are a special case of danskin's theorem and the envelope theorem:
https://en.wikipedia.org/wiki/Danskin%27s_theorem
https://en.wikipedia.org/wiki/Envelope_theorem
clarke's generalized gradients are just subgradients in the case of convex functions, which is the case here. my recommendation to the authors, if they want to publish their work, is to focus on the applications and to stop claiming novelty on the theoretical side.
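to make the sensitivity fact referenced above concrete: if $z(c) = \min_{u \in \mathcal{U}} \langle c, u \rangle$ over a finite feasible set, then $\nabla_c z(c) = u^*(c)$ wherever the minimizer is unique (danskin / envelope theorem). the sketch below checks this by finite differences on a small bipartite-matching instance using scipy's hungarian solver; the instance and helper names are illustrative assumptions, not taken from the paper under review.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def matching_value_and_solution(cost):
    # Optimal value of min-cost bipartite matching and its 0/1 solution matrix.
    rows, cols = linear_sum_assignment(cost)
    sol = np.zeros_like(cost)
    sol[rows, cols] = 1.0
    return cost[rows, cols].sum(), sol

rng = np.random.default_rng(0)
c = rng.normal(size=(4, 4))
value, u_star = matching_value_and_solution(c)

eps = 1e-6
fd_grad = np.zeros_like(c)
for i in range(4):
    for j in range(4):
        bumped = c.copy()
        bumped[i, j] += eps
        fd_grad[i, j] = (matching_value_and_solution(bumped)[0] - value) / eps

# Away from ties, the finite-difference gradient equals the optimal assignment itself.
print(np.allclose(fd_grad, u_star, atol=1e-4))
```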
[token-id arrays omitted: the input_ids and attention_mask columns for the row above]
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 15177, 856, 7334, 1332, 310, 3012, 625, 8542, 1097, 275, 2426, 273, 11990, 273, 7092, 285, 3885, 685, 2720, 2905, 789, 323, 28699, 38183, 11655, 253, 9380, 47284, 3198, 281, 320, 5520, 15455, 2167, 923, 2708, 50276, 19164, 414, 783, 2929, 21354, 327, 7269, 12342, 432, 38183, 13757, 326, 778, 320, 32139, 281, 1142, 17857, 32888, 10668, 533, 326, 452, 253, 2442, 323, 1781, 3486, 50275, 9188, 692, 1377, 10666, 9380, 4081, 1332, 310, 8542, 323, 1077, 1846, 13361, 873, 8777, 275, 295, 24343, 285, 4382, 8113, 326, 403, 908, 1388, 39799, 50275, 856, 84, 209, 575, 856, 7334, 1332, 310, 4722, 8542, 285, 4942, 3477, 281, 3359, 50275, 5040, 209, 575, 20790, 4028, 7005, 953, 1142, 2234, 4278, 3309, 281, 897, 253, 1332, 275, 3946, 4679, 1973, 253, 1332, 327, 1755, 273, 562, 15483, 3210, 285, 513, 417, 7568, 326, 253, 1332, 812, 320, 908, 342, 4980, 3210, 24088, 4116, 3169, 1086, 351, 398, 50276, 26122, 74, 11435, 326, 368, 2085, 247, 1077, 2087, 13612, 323, 26736, 253, 46350, 38183, 8090, 2299, 253, 2929, 3400, 2080, 1512, 1643, 4278, 323, 253, 1798, 3237, 49240, 11038, 285, 3425, 12420, 326, 3176, 275, 253, 4679, 253, 24864, 2144, 1057, 417, 1361, 634, 1332, 310, 12532, 285, 24432, 326, 513, 417, 452, 247, 4114, 275, 38183, 13757, 778, 971, 281, 897, 352, 253, 2929, 1057, 417, 2085, 2217, 4278, 281, 513, 594, 2654, 8171, 5933, 337, 342, 247, 3817, 2173, 281, 49240, 11038, 50276, 5658, 878, 281, 2085, 2080, 625, 2508, 11814, 327, 46350, 28490, 323, 253, 4706, 41933, 873, 273, 4679, 352, 369, 12744, 752, 2602, 4090, 305, 3561, 293, 5530, 4090, 5486, 310, 2602, 4090, 417, 253, 1072, 347, 278, 282, 50276, 6050, 253, 1273, 873, 273, 4679, 3400, 4217, 28913, 1783, 273, 253, 3486, 273, 634, 1332, 352, 21168, 327, 271, 36761, 1566, 368, 3630, 1223, 436, 10336, 310, 642, 3356, 253, 1755, 40247, 275, 2426, 273, 30497, 463, 7982, 4390, 1781, 3215, 11273, 1881, 42959, 3210, 403, 253, 1375, 23037, 14387, 50276, 262, 310, 1199, 625, 5919, 8376, 1699, 6941, 323, 46086, 342, 1027, 2957, 3470, 310, 253, 3885, 3064, 1663, 326, 1199, 516, 9861, 326, 326, 2789, 247, 3064, 275, 2426, 273, 534, 4679, 403, 17887, 50276, 13206, 337, 2139, 310, 30105, 89, 4789, 11795, 441, 73, 7760, 7152, 339, 431, 248, 4477, 1246, 247, 5853, 281, 19837, 38183, 13757, 749, 856, 23042, 715, 247, 11786, 18499, 1754, 2898, 253, 2746, 597, 6266, 15771, 760, 327, 9827, 273, 253, 1318, 273, 253, 38183, 2086, 3185, 273, 253, 2900, 4972, 285, 476, 320, 2218, 342, 4942, 1698, 18332, 2429, 281, 5609, 326, 6388, 26264, 38183, 11333, 281, 46350, 3603, 390, 253, 897, 273, 46350, 4872, 3362, 83, 1420, 10717, 8090, 50276, 9328, 41509, 285, 921, 253, 11361, 273, 616, 2746, 970, 767, 3626, 285, 4217, 6667, 253, 5661, 1543, 921, 9023, 285, 253, 2929, 310, 973, 3542, 285, 17194, 50276, 7152, 33032, 2520, 2929, 2722, 849, 281, 22629, 949, 38183, 11655, 407, 43073, 949, 253, 7445, 15895, 39322, 4685, 849, 281, 22629, 949, 38183, 13757, 594, 326, 352, 476, 320, 908, 347, 629, 273, 253, 1566, 390, 2957, 310, 1774, 347, 352, 28174, 1142, 3626, 5871, 891, 717, 4933, 436, 247, 5075, 2997, 347, 352, 310, 247, 4460, 2746, 323, 9827, 326, 253, 3114, 476, 1973, 327, 533, 253, 19274, 285, 5886, 281, 2720, 789, 285, 16774, 14023, 812, 320, 10046, 625, 4278, 2708, 50275, 296, 3755, 20556, 281, 253, 1682, 273, 619, 3640, 436, 310, 247, 4460, 2746, 326, 2789, 253, 20654, 285, 3626, 10291, 1469, 432, 247, 38183, 1895, 281, 271, 4164, 
81, 281, 271, 39322, 281, 43073, 949, 253, 39322, 970, 1929, 3082, 50275, 20881, 1255, 265, 253, 5962, 14855, 310, 253, 3480, 273, 247, 5301, 342, 2905, 7274, 323, 43073, 949, 38183, 11655, 824, 347, 690, 273, 253, 7274, 5469, 275, 253, 10199, 347, 268, 462, 1377, 280, 9169, 326, 1908, 2074, 3237, 253, 5661, 7533, 2783, 275, 436, 2929, 7277, 281, 1666, 25379, 326, 13414, 22629, 949, 253, 38183, 4809, 273, 253, 1895, 1223, 436, 310, 247, 1270, 3213, 273, 3588, 839, 253, 1612, 273, 841, 7274, 891, 1158, 326, 352, 651, 320, 3012, 625, 21414, 281, 45190, 7277, 281, 7274, 326, 22629, 949, 253, 38183, 11655, 50276, 74, 671, 1158, 697, 1774, 281, 2319, 253, 14023, 281, 253, 2905, 7274, 323, 43073, 949, 4764, 1025, 38183, 13757, 403, 253, 7274, 970, 253, 1072, 5426, 273, 247, 4309, 268, 462, 1377, 280, 9169, 25339, 271, 2523, 342, 253, 1524, 4309, 949, 38183, 13757, 1146, 440, 37650, 800, 390, 2822, 10528, 11678, 310, 436, 671, 271, 2523, 275, 253, 4758, 1060, 476, 436, 2746, 320, 2326, 347, 271, 11193, 390, 35701, 281, 253, 4309, 273, 253, 38183, 1895, 347, 253, 643, 7274, 50276, 338, 891, 2096, 9113, 436, 2746, 4419, 247, 1929, 10603, 432, 253, 38183, 1895, 281, 253, 4164, 81, 285, 432, 253, 4164, 81, 281, 253, 39322, 534, 812, 1056, 352, 625, 3206, 281, 4647, 685, 690, 273, 253, 2905, 3082, 326, 13414, 2430, 8958, 436, 1491, 50275, 977, 3533, 285, 5701, 849, 943, 253, 27935, 273, 253, 5415, 1666, 25379, 342, 30105, 89, 4789, 7277, 281, 253, 1332, 1146, 4081, 275, 253, 4679, 604, 597, 250, 970, 253, 7445, 15895, 39322, 943, 597, 320, 253, 1072, 275, 3762, 347, 4677, 337, 3588, 684, 533, 275, 3946, 1955, 281, 47037, 6332, 4245, 749, 29776, 10746, 347, 2829, 337, 2722, 50276, 783, 1390, 12494, 273, 253, 10199, 10262, 247, 830, 273, 253, 17705, 342, 247, 2957, 285, 253, 38183, 8103, 1318, 342, 14951, 28763, 417, 908, 1996, 327, 275, 253, 2929, 50276, 6377, 374, 1273, 12494, 253, 1390, 6197, 327, 46350, 5415, 298, 793, 82, 793, 3133, 4858, 432, 253, 1551, 273, 12494, 327, 38183, 1220, 735, 7152, 339, 793, 360, 3454, 50275, 783, 4477, 12661, 247, 2969, 1332, 281, 22318, 8103, 2193, 2931, 347, 253, 8654, 1318, 273, 247, 38183, 7007, 4872, 2086, 3692, 4764, 7024, 327, 253, 3453, 273, 247, 2176, 1566, 50275, 1542, 436, 597, 3877, 326, 14923, 11786, 273, 824, 8103, 2193, 403, 14556, 10302, 970, 253, 819, 1983, 285, 8746, 2900, 273, 253, 4164, 81, 3139, 275, 1798, 253, 4164, 81, 476, 320, 14042, 970, 18052, 285, 5919, 47037, 3185, 273, 16161, 247, 12314, 39322, 347, 4081, 275, 17336, 789, 50276, 783, 4477, 12661, 767, 1650, 4893, 326, 403, 2529, 10534, 285, 17618, 1411, 12314, 39322, 16161, 7274, 970, 38183, 18052, 1220, 735, 562, 32231, 12314, 39322, 16161, 7274, 275, 1307, 273, 13782, 285, 275, 1307, 273, 12820, 17082, 347, 12314, 39322, 16161, 310, 17134, 2122, 407, 6332, 534, 2789, 253, 4715, 1232, 11711, 463, 50276, 15337, 50275, 783, 2929, 310, 973, 3542, 285, 973, 10932, 253, 10527, 7794, 403, 973, 14290, 285, 253, 6667, 403, 5611, 10534, 285, 7690, 356, 462, 1037, 50276, 783, 1332, 3139, 310, 4722, 347, 352, 20096, 326, 253, 14923, 11786, 273, 1142, 4164, 81, 3237, 403, 2475, 494, 14556, 10012, 337, 3054, 1110, 23632, 285, 1142, 6667, 403, 5469, 50276, 251, 253, 643, 1133, 253, 38135, 273, 253, 1332, 778, 320, 247, 1652, 689, 33834, 275, 1798, 352, 310, 1929, 326, 970, 247, 12314, 39322, 47037, 310, 39670, 290, 1022, 417, 253, 954, 5919, 1039, 273, 12672, 253, 11786, 285, 326, 18052, 38183, 47037, 943, 320, 908, 50275, 1542, 253, 1895, 273, 305, 6678, 534, 10140, 281, 970, 247, 7870, 673, 2137, 
14650, 2957, 16161, 253, 1355, 39322, 310, 2218, 970, 7870, 10717, 970, 247, 277, 7553, 2957, 327, 1755, 273, 247, 3676, 11454, 2990, 556, 2168, 644, 5421, 923, 24088, 253, 11106, 34939, 348, 285, 37559, 293, 2929, 835, 253, 4477, 8415, 253, 39322, 970, 33234, 970, 247, 12314, 39322, 47037, 824, 347, 253, 581, 275, 30105, 89, 4789, 310, 247, 1652, 27785, 275, 326, 1083, 285, 352, 417, 10084, 326, 352, 17923, 15225, 50275, 249, 436, 1650, 359, 760, 2430, 253, 11786, 342, 1675, 281, 253, 2105, 268, 310, 819, 1983, 9275, 30908, 2276, 25711, 436, 7714, 671, 8046, 441, 281, 896, 44263, 366, 949, 30364, 50065, 10806, 970, 253, 7212, 4081, 275, 10012, 337, 534, 310, 1652, 1929, 275, 436, 3114, 7152, 339, 431, 248, 1318, 273, 253, 8654, 8103, 347, 247, 1159, 273, 253, 2105, 4972, 260, 476, 320, 3542, 347, 1182, 68, 50276, 291, 44274, 835, 253, 8654, 2900, 1484, 671, 7024, 327, 260, 253, 1159, 44274, 310, 5313, 3020, 3638, 50276, 9088, 403, 30268, 1183, 1385, 1598, 1142, 17887, 5482, 9183, 323, 1484, 50276, 395, 594, 253, 1159, 1182, 68, 310, 247, 5313, 3020, 4872, 1159, 273, 260, 342, 11786, 44274, 20312, 352, 4961, 5010, 627, 310, 19890, 749, 29844, 9090, 512, 352, 3936, 323, 12672, 44274, 310, 16161, 50276, 1279, 5430, 50276, 783, 38183, 1895, 436, 310, 512, 14916, 285, 973, 4304, 2568, 253, 4477, 513, 10534, 326, 50276, 5092, 352, 320, 9809, 407, 36636, 27935, 671, 273, 8772, 10806, 642, 841, 1543, 403, 5777, 1679, 14916, 533, 50276, 284, 4477, 11476, 50276, 609, 1929, 1580, 14752, 25761, 253, 11786, 342, 1675, 281, 260, 310, 253, 760, 581, 908, 275, 4679, 347, 2080, 347, 891, 2096, 50276, 261, 627, 3907, 1318, 275, 10012, 337, 891, 513, 417, 923, 352, 352, 3133, 281, 320, 247, 41274, 27436, 1475, 253, 8946, 906, 352, 760, 23970, 690, 3686, 273, 5502, 432, 247, 4972, 31238, 247, 38183, 1895, 281, 247, 4849, 273, 11390, 2056, 5395, 31238, 271, 7007, 2086, 671, 253, 4275, 4473, 273, 14923, 11786, 7960, 3400, 247, 7473, 7792, 281, 2312, 670, 1327, 22524, 27935, 387, 7548, 4811, 50276, 3549, 6241, 281, 749, 29844, 749, 19623, 451, 50276, 1542, 253, 1332, 3139, 352, 556, 642, 2173, 17200, 50276, 783, 3916, 273, 1805, 3045, 2429, 281, 30105, 89, 4789, 403, 671, 8839, 14122, 321, 20733, 50276, 17312, 89, 4789, 4390, 4648, 247, 5777, 749, 29776, 50276, 395, 247, 1077, 8214, 50276, 84, 14930, 323, 4872, 5659, 326, 310, 512, 2490, 187, 4118, 18435, 27, 2520, 2929, 2959, 1029, 11041, 275, 253, 10123, 50276, 74, 11697, 5194, 342, 271, 251, 15337, 254, 21, 326, 253, 10527, 1543, 3559, 275, 436, 2929, 403, 973, 4304, 1543, 327, 253, 7340, 1783, 273, 4872, 5659, 923, 323, 4227, 10199, 281, 4872, 13757, 407, 270, 797, 3549, 284, 285, 28669, 953, 1479, 38466, 8857, 608, 50276, 3062, 3839, 841, 1543, 403, 247, 2714, 1083, 273, 7723, 7232, 10012, 285, 253, 17329, 10012, 5987, 257, 25842, 2061, 44874, 301, 507, 5914, 1630, 296, 248, 4362, 5987, 257, 25842, 2061, 44874, 1914, 1155, 10666, 4362, 50276, 498, 782, 265, 14923, 27935, 403, 816, 749, 4971, 1104, 275, 253, 1083, 273, 17133, 3470, 534, 310, 253, 1083, 1060, 50276, 2577, 10774, 16977, 281, 253, 4477, 604, 597, 971, 281, 15452, 616, 789, 310, 281, 2770, 327, 253, 4893, 285, 281, 3523, 15081, 38135, 327, 253, 10527, 1930 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
the authors present a dataset for benchmarking the performance of ecg and ppg waveform imputation methods by leveraging publicly available icu waveform data and using realistic missingness patterns to create simulated gaps to be imputed while i believe that some of the baseline model choices dont make sense overall the paper is well written and timely they share the waveform data along with missingness masks along with train/validation/test splits so that others can easily compare imputation methods to baseline model performance several baseline models from the literature were implemented and tested along with a novel transformer model to serve as an additional baseline

strengths of the dataset include the scale and the realistic masking procedure used to generate missingness that tries to closely mirror patterns seen in mhealth systems i also appreciate that a downstream prediction task is also included in the dataset since often users care about how imputation quality will affect application performance many of the potential concerns i had with the dataset selection eg using icu patients mhealthspecific missingness patterns differences in ecg sensor quality from hospital to mhealth were preemptively addressed by the authors explanation in appendix a1

im not sure that mean and linear imputation necessarily make sense as baselines if a peak is defined as a local maximum ie xi > xi-1 and xi > xi+1 at index i in signal x then by definition there will be no peaks with mean or linear interpolation and the peak classification will fail this may explain the nan values and zero sensitivity of the mean and linear imputation in table 2 why not use a simple fftbased baseline since these are quasiperiodic signals the choice to not utilize the mimic matched waveform database so that covariates can be used either in the modeling process or during evaluation is also a weakness many parameters such as resting heart rate max heart rate etc vary as a function of age disease status medication usage etc and as a benchmarking dataset it would be nice to be able to incorporate that data

an existing architecture was slightly transformed and transferred to another domain ie wearables to address a gap in physiologic signal imputation the approach presented shows superior performance compared to existing work the authors demonstrate that the current stateoftheart algorithms trained on ecg and ppg waveforms perform poorly on signals obtained from wearables table 1 outlines a clear comparison to existing work the explanation why multichannel is out of scope is solid the approach was only tested on one dataset for each domain validation on other datasets would strengthen the paper

this paper raises the issue of missing data in pulsative signals collected from wearable devices and introduces an imputation benchmark pulseimpute specifically the authors extracted missingness patterns from realworld mhealth settings and applied them to two existing pulsative signal datasets they reproduced several existing imputation methods and proposed a new imputation method and used them as baselines applying these baseline imputation methods to their processed pulsative signal datasets with missing values the authors proposed benchmarks for the downstream tasks this paper defines an interesting research question and provides code to replicate the baseline results

i can see why the authors chose to apply the missingness patterns in the mhealth settings to the existing pulsative signal datasets rather than collecting data directly from mhealth wearables due to a possible lack of ground truth admittedly the authors also compared several differences between data collected from mhealth settings and clinical settings however the lack of a direct comparison of the differences in missing patterns in the data collected from the two settings and what exactly the missing patterns are in the data collected from mhealth wearables makes the choice of curated datasets to address the question raised in this paper less convincing what makes pulsative signals more interesting/compelling than other signals from a missingness perspective was less motivated the paper is a bit hard to follow and there are some inconsistencies and inaccessible references

the usage of wearable sensors for medical purposes promises better monitoring with high frequency information however the usage of such sensors worn in daytoday life often leads to gaps in the sensory information imputation techniques attempt to fill such gaps but are lacking for pulsative signals like ecgs and ppgs this submission supplies datasets methods to mimic realistic missingness in the data as well as challenge tasks to evaluate pulsative signal imputation further the authors propose a benchmark transformer model where the signal tokenizer is realized via dilated convolution and empirically demonstrate significant outperformance compared to previous work on their benchmark tasks

the proposed datasets and challenges appear highly relevant to facilitate broad usage of wearable sensors for medical purposes the definition of evidencebased patterns of missingness fills a gap in previously published work to that end that definition is made accessible by the proposed challenge which is based on existing and peerreviewed datasets incorporating the patterns of missingness the experiment design is clear and thoughtful the downstream tasks are both difficult and relevant tasks for pulsative signals to test imputation methods the data provided makes it easy for researchers to work on the topic the authors propose a model architecture which demonstrates significant outperformance on the benchmark tasks

in my view there are no major weaknesses two minor issues first the presentation of the bdc architecture may be a bit too confident in ll 211 following the authors state their requirements for their benchmark model and state that no existing transformer models address all three issues sufficiently id argue that these are rather common issues as an example images pixels require local context require some measure against permutation equivariance and have to deal with scaling of long sequences vision transformers (https://arxiv.org/abs/2010.11929) vit have found ways to deal with that so has the perceiver (https://arxiv.org/abs/2103.03206) architecture directly on the data cvt (https://openaccess.thecvf.com/content/ICCV2021/papers/Wu_CvT_Introducing_Convolutions_to_Vision_Transformers_ICCV_2021_paper.pdf) adds convolution data encoders to vit similar to the authors approach that does not take anything away from their model architecture which they show works fine id find it fair though to put it into context of other work the authors further state that positional encodings dont have a good inductive bias in their setting line 239 im curious is that empirical or are there other reasons to think that on other domains again images as example position encodings work surprisingly well to contextualize second equation 1 seems somewhat overcomplicated selfattention is commonplace and a softmax would have simplified it greatly

in this paper the authors proposed a new benchmark task for physiological sensor signal imputation specifically the authors focus on ecg and ppg signals and use public datasets for evaluation they first simulated realistic missing patterns for ecg and ppg signals by ablating samples and built a dataset with realistic missing data they implemented eight existing traditional or modern timeseries imputation techniques moreover they also proposed and developed a new bottleneck dilated convolution bdc selfattention architecture that fits the characteristic ecg/ppg data these data usually have a longrange structure as pulsative signals the authors then evaluated the performance of these algorithms on 1 the raw signal reconstruction task and 2 three downstream tasks a heartbeat detection in ecg b heartbeat detection in ppg and c cardiac pathophysiology classification in ecg their results indicate that the new bdc technique is significantly better than all baselines

the topic of signal sensor imputation is an important realistic and very practical problem in mhealth daily physiological data collection both the raw reconstruction error and the three downstream tasks are valid experiment designs the design of the new architecture does leverage the pattern of ecg/ppg properties and the advantages over the baseline methods are encouraging

physiological signal types may go beyond ecg/ppg while ecg/ppg is arguably one of the most commonly collected physiological signals in mhealth applications there are other common sensors that are not covered in this paper such as imu accelerometer gyroscope and magnetometer gsr eeg etc i am not arguing that the authors must evaluate their techniques on these signals but their characteristics could be very different from ecg/ppg signals for example they may not have a clear pulsative pattern i am curious to know the authors consideration of the generalizability of the technique and the potential necessity to tone down the papers framing or clarify the scope of the paper

baseline method selection the authors compare the new technique against eight baseline techniques which is great however why not compare against the two existing papers that specifically focus on imputing mhealth pulsative signals ie 20 and 40 the citation numbers in the paper please justify

extremely low recall for baseline techniques related to the previous point in table 2 the performance of the baseline techniques all have very low recall thus low f1 score the authors provided some reasons in the text which is great but such a low performance raises a concern about whether these baselines are too easy to beat comparing against the sota technique could provide more valid results

20 arman iranfar adriana arza and david atienza relearn a robust machine learning framework in presence of missing data for multimodal stress detection from physiological signals arxiv preprint arxiv:2104.14278 2021
40 hillol sarker matthew tyburski md mahbubur rahman karen hovsepian moushumi sharmin david h epstein kenzie l preston c debra furrholden adam milam inbal nahumshani et al finding significant stress episodes in a discontinuous time series of rapidly varying mobile sensor data in proceedings of the 2016 chi conference on human factors in computing systems pages 4489-4501 2016
### Summary:
this paper develops a new benchmark for missing data imputation in pulsative signals like ecg using realistic missingness models i expect such a dataset to drive important developments in this understudied area indeed the authors show that standard sota methods fail the reviewers enthusiasm makes this paper a clear accept
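The first review above argues that mean and linear imputation cannot place peaks inside an imputed gap, which would explain the zero sensitivity those baselines get on heartbeat detection. The short sketch below illustrates that argument on a toy quasi-periodic signal; it is not code from the benchmark under review, and the signal shape, gap location, and helper names are invented purely for illustration.

```python
import numpy as np

def local_maxima(x):
    # peak definition used in the review: x[i-1] < x[i] > x[i+1]
    return [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] > x[i + 1]]

rng = np.random.default_rng(0)
t = np.arange(400)
signal = np.sin(2 * np.pi * t / 40) + 0.05 * rng.standard_normal(t.size)  # toy quasi-periodic "pulses"
gap = slice(120, 280)                                                     # simulated missing segment

# mean imputation: the gap becomes a constant segment
mean_imputed = signal.copy()
mean_imputed[gap] = np.delete(signal, np.arange(gap.start, gap.stop)).mean()

# linear imputation: the gap becomes a straight ramp between its endpoints
linear_imputed = signal.copy()
linear_imputed[gap] = np.linspace(signal[gap.start - 1], signal[gap.stop],
                                  gap.stop - gap.start)

print(len(local_maxima(signal[gap])))          # several true beats fall inside the gap
print(len(local_maxima(mean_imputed[gap])))    # 0: a constant segment has no interior local maxima
print(len(local_maxima(linear_imputed[gap])))  # 0: a monotone ramp has no interior local maxima
```

Under that peak definition any constant or monotone segment has no interior local maximum, so a peak detector evaluated on the imputed gaps of these baselines necessarily scores zero recall there, which is consistent with the nan values and zero sensitivity the reviewer points to in table 2.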
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper claims that training policy and value networks separately for actor critic algorithms like ppo or ppg can lead to overfitting of the value network in procedural generalization tasks like procgen to get around the overfitting issue it proposes to update the value network less often and with more data it also proposes to add a selfsupervised objective to further improve generalization performance and observes modest gains on some games in procgen

strengths
the paper makes an interesting empirical observation of overfitting in separate update ppg agents that could harm their generalization performance the paper is very clearly written except for section 43 and gets the idea across clearly figure 1 is a great summary of the proposed modification with clear differences to ppg and ppo the fix for the overfitting issue is straightforward and easy to implement the experiments are performed on a well benchmarked environment suite procgen and include informative ablations there are also a bunch of valuable auxiliary experiments investigating value function learning

weaknesses
the contribution of delayed value updates itself seems not quite significant very specific to ppg and its unclear how well it would generalize to other algorithms the overfitting issue might just vanish as we scale the model and data size for these agents so i fear the delayed value update contribution might end up being a regularization game for this specific ppg setup and end up getting bitter lessoned in the long run note the above two are subjective opinions on significance of the contributions the improvement for dcpg over ppg seems marginal especially when we look at individual performance curves in figure 6 of the appendix it seems like there are only 3-4 out of 16 games where the regularization has a clear and significant benefit the contribution of a selfsupervised objective with learned dynamics has been explored multiple times in the literature spr (https://arxiv.org/abs/2007.05929) pbl (https://arxiv.org/abs/2004.14646) deepmdp (https://arxiv.org/abs/1906.02736) etc moreover it has been explored explicitly in procgen as well in muzero (https://arxiv.org/abs/2111.01587) and driml (https://arxiv.org/abs/2006.07217) the idea of using a discriminator for this task was proposed in cpcaction and evaluated in procgen in driml and ablated in muzero using an inverse modelling objective along with the forward objective was also proposed in sgi (https://arxiv.org/abs/2106.04799) the paper unfortunately doesnt mention or appropriately position itself against these works the reporting of results does not follow the recently proposed recommended practices as described in rliable (https://github.com/google-research/rliable) and thus might be prone to making incorrect conclusions more details below na

this paper proposes a new approach for rl in procedurally generated environments with empirical evidence at its core is the critic learning problem and potential overfitting the proposed approach mainly builds on the phasic policy gradient method by unifying it the method is compared against natural baselines and other environmentrelevant methods the experiment section also shows several ablation studies the contribution of this paper is original and appears relevant to the community it nicely builds on a battery of previous work which has some overfitting limitations the empirical analysis is quite furnished and gives some very interesting insights however the improvements provided by dcpg or ddcpg over the baselines appear to be less important than what is claimed by the authors in particular the statistical significance in tables 1 2 and 5 depicted by bold text and the empirical analysis in section 52 cannot be conclusive statistically speaking when compared to the baselines dcpg appears to improve on 2/18 tasks and ddcpg on 4/18 indeed unfortunately the stds appear in many instances to be generally too significant to draw any statistical conclusion we develop more points below regarding the quality overall the paper is also mostly well written and develops its ideas clearly the potential negative societal impact or the limitations of the work have not been addressed

in this paper it is first shown that the value function is easier to overfit compared to the policy function based on this observation a new policy gradient algorithm called delayedcritic policy gradient dcpg is proposed which reduces the overfitting of the value function by less frequent updates with more training data moreover a single discriminator is applied to learn both forward and inverse dynamics of environments which helps to generate better representations and improve the performance of dcpg across different tasks

this work presents two novel policy gradient algorithms dcpg and ddcpg compared to ppg dcpg has a more compact model structure with only one encoder while achieving comparable or better performance on different tasks moreover by combining the discriminator that learns environmental dynamics ddcpg boosts the performance of dcpg furthermore this paper is clearly written with a good flow the analysis experiments and the ablation studies are also helpful in understanding the effectiveness of dcpg and ddcpg

there are several minor issues in section 22 the clip operation is not explained though it might be obvious to readers familiar with ppo in section 43 both the encoder $e_\theta(\cdot)$ and the dynamics head $f_\theta(\cdot, \cdot, \cdot)$ are parameterized by $\theta$ is the encoder the same one as shown in figure 1c if so how could the same $\theta$ be used to parameterize a different function ie $f_\theta(\cdot, \cdot, \cdot)$ it is better to present the model structure of ddcpg as well to clarify in table 2 the error bars of some results overlap with each other making the claim in section 54 less convincing
### Summary:
all reviewers were in favor of acceptance and after reading the paper myself i am in agreement the empirical results were good and the experimental work quite comprehensive the method is well explained and the writing is clear and easy to read the only real criticism i had distinct from things already mentioned by reviewers was that there are some statements that feel overly strong given the presented results eg l161 l179 i was also curious about sensitivity to hyperparameters specifically the settings around how long each phase lasts etc along the lines of what was done in the ppg paper that said i would generally downplay the importance of this since the proposed method uses the same hyperparameters as ppg and appears to have undergone minimal hyperparameter tuning all in all i think this makes a clear contribution to the field and should be accepted
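the reviews and the meta review above all center on the idea of updating the critic less frequently but on more data than the policy; as a purely illustrative aid, the sketch below shows one generic way such a delayed critic schedule could be wired into an actor critic training loop. the agent interface, the delay period and the buffer handling are all invented for illustration and are not taken from the reviewed paper or any released code.

```python
# illustrative sketch only: the policy is updated after every rollout while the
# value network (critic) is updated once every K rollouts, on the pooled data
# from all K of them. `update_policy` / `update_value` are assumed helper
# methods on a hypothetical agent object, not an API from the reviewed paper.

K = 8  # hypothetical delay: one critic phase per K policy phases

def train(agent, env, rollout_fn, num_iterations):
    value_buffer = []                          # rollouts held back for the critic
    for it in range(num_iterations):
        rollout = rollout_fn(env, agent)       # collect fresh on-policy data
        agent.update_policy(rollout)           # frequent policy update
        value_buffer.append(rollout)           # save the data for later
        if (it + 1) % K == 0:                  # delayed, larger-batch critic update
            agent.update_value(value_buffer)
            value_buffer.clear()
    return agent
```

under this kind of schedule the critic sees k times more data per update than the policy does, which is one plausible reading of how the reviewed method tries to limit value overfitting.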
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary and contributions the paper presents an empirical study of the open set image recognition osr problem it considers two classification setups singledomain and crossdomain and one semantic segmentation setup in particular the paper evaluates simple statistical models like the nearest class means ncms kmeans and gaussian mixture models gmms built atop ots deep features against deep sota baselines like clsk1 msp openmax c2ae gdm etc the paper demonstrates that simple statistical models outperform sota models if deep ots features pretrained on large datasets are used normalized and their dimensionality is reduced through spatial pooling and pca this also results in really lightweight image recognition pipelines for osr settings detailed review the following is the detailed review of the paper organized into strengths and weaknesses subsections strengths relevance and significance most ml models when deployed in realworld settings often need to operate in openset or open world conditions and need to provide robust estimates including an outofclass unknown label when encountering data from unknown classes models that have good osr properties should be of broad interest to the ml community empirical results in the paper suggest that simple statistical techniques can be quite effective in performing openset recognition the paper also validates the significance of performing feature preprocessing steps towards obtaining performance on par with or exceeding that of more complex sota methods clarity the paper is written well and is easy to understand weaknesses relation to prior art the paper does a reasonable job of presenting the prior art identifying the challenges and need for the presented work however it doesnt cite the following relevant works 1 sun et al conditional gaussian distribution learning for open set recognition cvpr 2020 2 geng et al recent advances in open set recognition a survey ieee pami 2020 novelty the study involving the exploration of preprocessing steps on a variety of simple statistical models seems novel however the survey paper by geng et al 2 also conducts an empirical study involving many of the models considered in this paper the paper needs to reference and compare against 2 empirical evaluation the empirical evaluation is inadequate experiment results are shown only for limited closed sets for training being an empirical paper it is expected for the paper to have more detailed experiments in addition 1 the paper needs to redesign the study in light of 2 2 it should compare results against more recent sota models for example 1 given that both uses classconditional gmms 3 authors mention in section 4 that hyperparameters are selected based on a smallscale validation set however the size of this validation set is not specified details about the validation set are needed to determine performance of the experiments 4 in section 41 the c2ae scores in table 1 seems to be lower than what is reported in their paper however experimental setup the same httpsarxivorgpdf190401198pdf is there any specific reason for this 5 section 42 results on more than one closeworld training dataset including mnist cifar for crossworld testing should be provided as also done by shafaei et al 6 the experimental results need to be explained better in sections 41 and 43 gmms gives good results while on the setting of section 42 gmms dont do well what is the explanation for this 
assessment this paper presents a study showing that simple lightweight statistical models can outperform deep sota models on the image osr problem this should be of interest to the ml community however the paper seems agnostic of a couple of recent works which render the study and the empirical evaluation inadequate in addition experimental evaluation and analysis needs to be improved as per the observations made above docsepsummary of paper the main claim of the paper is that out of distribution ood detection can be done by use of pretraining and appropriately deriving a feature space from sota activations via pooling pca based dimensionality reduction l2 normalization classical methods such as gmms kmeans etc can then be used to estimate the probability density function of features for use in ood detection several alternative schemes are compared against many ood detection schemes key contributions claimed are that pretrained nets have information about openworld statistics and offtheshelf net features along with appropriate choice of a lowdimensional representation helps in outperforming conventional ood schemes the paper builds upon recent work on ood method benchmarking method by shafaei et al 2019 that argues that most ood schemes are not able to pass a less unbiased test designed wherein a source data set is used for training using standard methodology validation data set for estimating a decision function between source and validation and finally the probability of outlier detection on other datasets and their variability is estimated to get robust view of ood 1 the general approach is as i think quite misguided the idea is to use pretrained datasets extract features from it and apply a gaussian mixture model this means though that such an approach will be able to recognize only classes for which features were made available hence the closedworld problem is just extended by additional knowledge about more classes from selected datasets no openworld problem is solved 2 the fact that pretraining helps is a generic statement which depends on the statistics of the dataset used for pretraining and then the dataset on which it is tested 3 the setup in 41 split a single dataset into open and closed sets wrt class label is questionable as the image statistics in the partition are same 4 in 42 the authors use an open dataset for validationtuning this makes this dataset not open per definition it may be true that it helps in the model generalizing better but the terminology is still misguided in my view the paper applies the testing methodology of shafaei et al 2019 incorrectly to design an ood algorithm and claim its superiority none of the tables in the paper provide error bars in order to convince that the insights are correct i would expect that the experiments need to be run with a larger sampling of outlier datasets as done in shafaei et al docsepmethodology the paper tackles the socalled openset classification where query examples outside any of the classes in the training set should be detected at inference by combining the feature extractor based on fashionable deep models and the classical clustering methods kmeans gmm etc the paper empirically shows that this pipeline can address many openset problems in realistic scenarios pros unfortunately none cons the paper basically reinvents the wheel the socalled openset classification problem is precisely the anomaly detection problem that has been studied for decades see v chandola a banerjee and v kumar anomaly detection a survey acm 
computing surveys vol 41 no 3 p 15 2009 and the use of classical clustering methods kmeans gmm etc was among one of the first efforts for anomaly detection applying the identical strategy on top of the fashionable deep features is the routine treatment for anomaly detection 101 in 2020 and brings in no novelty to the community docsepthis paper proposes an openset recognition approach that uses simple statistical measures such as gmms and kmeans on top of postprocessed intermediate features extracted from closedset deep models it finds that i this lightweight pipeline outperforms prior methods on openset image recognition across multiple evaluation protocols at much lesser memory and compute cost and ii openworld recognition generally benefits from using models pretrained on large datasets such as imagenet rather than training from scratch iii the technique also generalizes to openworld semantic segmentation strengths the paper studies an important problem is motivated clearly and is wellwritten the proposed approach is intuitive memoryefficient and appears to clearly and consistently perform prior work the modeling choices are motivated and analyzed well for eg fig 3 clearly illustrates why l2normalized and pooled intermediate features more clearly capture indomainness than logits used predominantly in prior work the set of experiments and baseline comparisons appears comprehensive the performance vs memory tradeoff of the current method vs prior work is analyzed well the limitations of the proposed method are also explored for eg fig 5 that shows how for open set semantic segmentation the proposed method is a better alternative than training a binary classifier only when very little outofdomain data is available to train on weaknesses studying howwhether the choice of layer from which features are extracted affects performance would have been an interesting addition additional comments suggestions fig 6 is difficult to read since some of the lines overlap varying opacity might help with readability overall comments this is an interesting and wellwritten paper that proposes simple and memoryefficient alternatives for openworld recognition that consistently outperform more complex methods from prior work postrebuttal comments i have read the concerns raised by other reviewers as well as the author response i still feel that this is an interesting work and its findings are of potential value to the community there has been a considerable amount of recent work in open set recognition for deep models and this paper calls into question the need for sophisticated techniques by showing fairly rigorously in my opinion that simple strategies and the right choice of feature engineering works better i agree with the authors that not using imagenet pretraining is an unrealistic and unnecessary constraint  moreover the method generalizes even when evaluated on datasets such as mnist and svhn which are distributionally very different from imagenet further i think the paper acknowledges prior work appropriately and did not find any of the claims made to be unreasonable i agree with r1s concerns about overlap with two recent papers but found the author response to be satisfactory overall i will retain my accept rating ### Summary:
the paper addresses openset recognition namely detecting anomalous samples that belong to classes not observed during training it has been shown that existing methods fail on openworld images the current paper shows empirically that performance can be greatly improved if recognition is based on lowdimensional features reviewers had grave concerns about the novelty of the approach and the logic behind the workflow they found merit in the paper but chose to retain their scores after reviewing the rebuttal as a future recommendation it would be useful to provide more evidence about which components of the method or workflow are novel and what makes them work well
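several of the reviews describe the same recipe in prose: take spatially pooled features from a frozen pretrained backbone, l2 normalize them, reduce dimensionality with pca, fit a simple density model such as a gmm on the closed set, and flag low likelihood test samples as open set. the snippet below is a minimal sketch of that generic recipe with scikit-learn; it is not the reviewed paper's code, and the random stand-in features, dimensionalities and percentile threshold are placeholders chosen only to make the example run.

```python
# minimal sketch of the pooled-features -> l2 norm -> pca -> gmm scoring recipe;
# the random arrays stand in for pooled features from a pretrained backbone.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

def postprocess(feats, pca=None, n_components=64):
    """l2-normalize pooled features and project them with (optionally fitted) pca."""
    feats = feats / (np.linalg.norm(feats, axis=1, keepdims=True) + 1e-8)
    if pca is None:
        pca = PCA(n_components=n_components).fit(feats)
    return pca.transform(feats), pca

train_feats = np.random.randn(1000, 512)   # placeholder closed-set features
z_train, pca = postprocess(train_feats)
gmm = GaussianMixture(n_components=8, covariance_type="full").fit(z_train)

test_feats = np.random.randn(10, 512)      # placeholder query features
z_test, _ = postprocess(test_feats, pca=pca)
scores = gmm.score_samples(z_test)         # per-sample log-likelihood
cutoff = np.percentile(gmm.score_samples(z_train), 5)   # hypothetical threshold
is_open_set = scores < cutoff              # low likelihood => likely unseen class
```

whether the gmm is fit per class or on all closed-set features at once, and where the threshold is placed, are exactly the kinds of choices the reviewers above ask the authors to report more carefully.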
534, 403, 3268, 595, 1077, 1027, 432, 4440, 257, 292, 2007, 891, 1158, 253, 2929, 26785, 2720, 789, 20420, 285, 858, 417, 1089, 667, 273, 253, 3916, 1160, 281, 320, 20697, 891, 5194, 342, 391, 18, 84, 7350, 670, 14787, 342, 767, 3332, 9380, 533, 1119, 253, 2488, 2380, 281, 320, 20297, 4583, 891, 588, 13280, 619, 2997, 13716, 2490, 187, 4118, 18435, 27, 783, 12453, 13279, 292, 8981, 10775, 15549, 31946, 3530, 326, 5663, 281, 5971, 417, 2540, 1309, 3733, 50276, 262, 556, 644, 2011, 326, 5368, 3082, 1891, 327, 1527, 10186, 3888, 253, 1655, 2929, 2722, 45190, 326, 3045, 476, 320, 10260, 5520, 604, 1754, 327, 1698, 6967, 3386, 50275, 15337, 398, 574, 16102, 7350, 670, 253, 38135, 273, 253, 2746, 285, 253, 9317, 3212, 253, 24824, 597, 1119, 15785, 275, 253, 2929, 533, 9703, 281, 13280, 616, 7363, 846, 16725, 253, 30080, 22559, 347, 247, 2852, 17401, 352, 651, 320, 4217, 281, 2085, 625, 1941, 670, 752, 4445, 273, 253, 1332, 390, 24824, 403, 4460, 285, 752, 2789, 731, 789, 973, 209 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

Paper summary: the paper gives theoretical proofs showing that the recently proposed data augmentation technique mixup can indeed improve generalization and help with robustness. The theorems cover GLMs and certain classes of neural networks. The paper also contains numerical experiments supporting some aspects of the theory.

Strengths:
1. Currently there is only a limited theoretical understanding of why mixup works. This paper shows that mixup is essentially equal to regularizing the first and second derivatives with respect to the input x. Intuitively, this means that changing the training samples slightly shouldn't change the output of the model much. Further, the paper proves that the mixup loss is an upper bound on the 2nd-order Taylor approximation of the adversarial loss, and hence reducing the mixup loss reduces the adversarial loss. Finally, the paper proves that mixup helps in reducing the Rademacher complexity and hence improves generalization.
2. The results seem fairly general and apply to many models such as GLMs and neural networks.
3. The paper supports its approximations and claims with numerical experiments.

Concerns:
1. The regularizing term $\mathcal{R}_3$ looks like it is minimizing $z^\top \nabla^2 f_\theta(x_i)\, z$ for some $z$. This promotes the Hessian w.r.t. $x$ having negative eigenvalues in the direction of $z$. Ideally we would want the Hessian (and also the gradient) to be 0 around the training samples, so that perturbing the input doesn't change the output much. Thus I don't see how the $\mathcal{R}_3$ term helps regularize the Hessian properly.
2. The paper claims that Assumption 3.1 holds when the minimizers are not too dispersed. Does it still hold for practical neural networks, where the minimizers can possibly be fairly far apart?

Comments: although the paper seems well written, I have a few suggestions.
1. The notation $\cos(\theta, x)$, which refers to $\frac{\langle \theta, x \rangle}{\|\theta\|\,\|x\|}$, should be explained in the preliminary section.
2. On page 6, the statement $f_\theta(x) = \nabla f_\theta(x)^\top x$ should be proven; it will save the reader some time if the proof is provided.
3. In Remark 3.1, I think Theorem 3.2 should actually be Theorem 3.4.

Score justification: there isn't much prior work on the theoretical understanding of mixup. This paper provides theoretical guarantees for mixup on two fronts, robustness and generalization, for both GLMs and ReLUs.

Review 2:

This paper shows that mixup training is approximately a certain kind of regularized loss minimization. Based on this, it provides some theoretical analysis of the advantages of mixup training for generalization and for adversarial robustness against one-step attacks. The paper provides many insights on why mixup works, e.g., connecting its 2nd-order approximation with the L2 adversarial loss, and it shows that the Rademacher complexity of mixup adaptively characterizes the intrinsic dimension of the empirical data distribution. Though the techniques used in the paper were already developed by other works, the new results and insights on mixup in this work are worthy of being known by the community, particularly because mixup is such a popular data augmentation trick in deep learning.

Some questions:
1. Could you provide any comments on mixup and adversarial training, e.g., one-step and multi-step variants?
2. What about the generality of the analysis for L-infinity attacks?

Review 3:

The paper theoretically studies the beneficial effect of mixup on the robustness and generalization of machine learning models. The mixup loss is rewritten as the sum of the original empirical loss and a regularization term, plus a high-order term. For robustness, the regularization term is proven to be an upper bound on the first- and second-order terms of the adversarial loss's Taylor expansion; hence the mixup loss can upper bound the approximate adversarial loss. For generalization, the regularization term is used to control the hypothesis class to have small Rademacher complexity. The paper is clearly written and well organized.

Pros:
1. Rigorous theoretical analyses are conducted on nonlinear models, specifically the neural network model.
2. The theoretical results are clean and insightful.

Cons:
1. When studying robustness, an approximated adversarial loss is considered: the approximated loss is the truncation of the Taylor expansion of the original loss. The quality of this approximation is not explored in the paper. It would be better to provide numerical evidence on whether the bounds in Thm. 3.1 and 3.3 still hold for the original adversarial loss and how tight the bounds are.
2. In the generalization part, only an indirect connection is built between the mixup loss and the generalization gap; no result is provided concerning the generalization error of the solution found by minimizing the mixup loss.

Review 4:

Summary: this paper has an extensive analysis of mixup augmentation, focusing on its effect on adversarial robustness and generalization. For adversarial robustness, the authors try to make a connection between the mixup loss and the adversarial loss; for generalization, they argue that mixup is a kind of data-adaptive regularization.

Comments:
1. Good contribution in the authors' careful analysis connecting the mixup loss and the adversarial loss. It seems to be the first theoretical analysis discussing their connection, since past works just report numbers to show how robust mixup and its variants are to single-step adversarial attacks.
2. Good for the community to have a connection between mixup and Rademacher complexity; I think it can make some impact to discuss the high-level connection between data augmentation and model generalization.

### Summary:
This paper provides theoretical justifications for why the data augmentation technique mixup (convex combinations of pairs of data examples) can help in improving the robustness and generalization of GLMs and ReLUs. The authors rewrote the mixup loss function as the summation of a standard empirical loss and some regularization terms (regularizing the gradient, the Hessian, and some higher-order terms). Using the quadratic approximation of the mixup loss (ignoring the higher-order terms), the authors proved that this approximation is equivalent to an upper bound on the second-order Taylor expansion of an adversarial loss, providing justification for why training with the mixup loss could improve robustness against small attacks. Using the same quadratic approximation of the mixup loss, the regularization term controls the hypothesis class to have a smaller Rademacher complexity. Overall, the paper provides insightful theoretical interpretations for a commonly used data augmentation technique in DL, and it also supports its claims with numerical experiments. Although there are some minor concerns about using the quadratic approximation of the mixup loss, as well as the $\mathcal{R}_3$ term's regularization effect on a broader family of models, the paper provides unique and novel insights on mixup. All reviewers acknowledge the authors' application of existing proof techniques to analyze mixup's effect on robustness and generalization; therefore I recommend accepting this paper.
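The reviews and the meta-review above describe mixup as training on convex combinations of pairs of examples and their labels, with the resulting loss behaving like the empirical loss plus a data-dependent regularizer. The following is a minimal, generic sketch of that batch construction, not code from the paper under review; the Beta parameter `alpha` and the pairing by random permutation are standard choices assumed here for illustration.

```python
import numpy as np

def mixup_batch(x, y, alpha=1.0, rng=None):
    """Build a mixup batch: convex combinations of randomly paired examples.

    x: (n, d) array of inputs; y: (n, k) array of one-hot or soft labels.
    Generic illustration of the augmentation discussed above, not the
    reviewed paper's code.
    """
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha)          # mixing coefficient drawn from Beta(alpha, alpha)
    perm = rng.permutation(x.shape[0])    # random pairing of examples
    x_mix = lam * x + (1.0 - lam) * x[perm]
    y_mix = lam * y + (1.0 - lam) * y[perm]
    return x_mix, y_mix

# Training then minimizes the usual loss on (x_mix, y_mix); the reviews argue this
# behaves like the empirical loss plus a data-dependent regularizer on the
# gradient and Hessian of the model with respect to the input.
```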
[input_ids / attention_mask / labels for this example: the tokenized Input text above, an all-ones attention mask, and a verbatim copy of input_ids. Long numeric token-ID arrays omitted here for readability.]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

The paper looks at the setting where the goal is to learn a policy without access to a reward from the environment, with access only to feedback. Previous work (Xie et al., 2021) assumes that the feedback does not contain the action; this paper removes that assumption and proposes a new contrastive approach that can integrate feedback with actions to learn a policy. Through different empirical evaluations (a toy MNIST task and a large-scale synthetic evaluation on OpenML), the authors show how this approach can learn from feedback that contains the actions taken by the agent.

Strengths:
- The paper removes some assumptions introduced in previous works to bring algorithms closer to being realised in realistic environments.
- Learning from non-scalar rewards and feedback is an important direction, and this work makes progress towards making this possible.
- The paper has the appropriate ablations and model comparisons.

Weaknesses (I am not very familiar with the relevant literature in the subfield; as someone from a related area but outside the subfield, here are some potential weaknesses, and my main concerns lie with the writing of the paper):
- The paper is not clearly written. The introduction does not motivate the central problem well; I would have liked to see a few examples where previous methods fail but the paper is able to bridge the gap. The authors add these to the conclusion, but it might be useful to add examples to the introduction to motivate the problem. Section 3 is especially difficult to understand, as there is a lot of notation with little intuition.
- The empirical evaluations aren't described well. I had to refer to Xie et al. (2021) to get a better idea of what the evaluations look like; similarly, the previous paper also does a much better job of describing and motivating different applications.
- The empirical evaluations are limited, assuming very simple policies and representations; these would be difficult to realize in the realistic settings that the authors use as examples.
- The intuition behind the feedback signal for the MNIST classification task is not clear.
- The empirical evaluations seem detached from potential practical applications.
- Feedback is often noisy; the authors do not consider this scenario.
- There is a lack of discussion of why a CB setting is better than an MDP formulation for the tasks proposed.

The authors discuss the potential negative impact of the work well. The discussion of limitations of the approach and testing might be lacking (see weaknesses).

Review 2:

This paper studies the problem of interaction-grounded learning, where an agent takes an action a given context x and receives a feedback vector y in return. A reward to be optimized is also generated by the environment but is not revealed to the learning process. The goal is to learn a policy that optimizes reward without explicitly observing reward from the environment, by learning a mapping from y to the space of rewards and learning a value function mapping x and a to [0, 1]. An existing challenge of this setting is that IGL fails when y contains information about the executed action a. This paper proposes a contrastive learning approach that allows IGL to succeed even when y contains information about a. Experiments on several RL benchmarks show improvements over IGL assuming conditional independence of y given a, nearing the performance of a contextual bandit, which assumes full access to reward.

Strengths:
- The paper is relatively clear and easy to follow. As someone who doesn't have much practice reading theory-heavy papers (unfortunately this paper is a few steps out of my area), I found it easy to follow, and I found the information-theoretic arguments intuitive.
- The main contributions have potentially broad significance, e.g., for applications as discussed in the introduction.

Weaknesses:
- See assumptions below; a discussion of unstated assumptions (if any) and their limits on real-world applications would be nice to have.
- I found the format of Figure 2 somewhat confusing; to me, the y-axis of the graphs seems to imply some kind of sequential process.

See above about limitations (assumptions for applications to real-world settings).

Review 3:

This paper explores a straightforward reformulation of a newly introduced formalism, interaction-grounded learning (IGL), in which an agent must learn to act optimally in an environment given access only to some context, an action space, and per-step feedback that is some function of a latent reward. Whereas prior work formalizes the feedback component of IGL as conditionally independent of the context and language given the hidden rewards, this work (1) shows that this prior conditional independence assumption is too strong, leading to pathological failure modes when action information is present in the feedback, (2) presents a new formulation with less restrictive conditional independence assumptions that condition feedback on both the latent reward and the executed action, and (3) presents a learning algorithm for deriving good policies given the new formulation and feedback. Experiments on a modified MNIST task (where the context and action are the image and label respectively, and high-dimensional feedback is given as some image y whose digit is taken as action 6 binary-reward 3 modulo 10), plus a similar construction for the various OpenML CC-18 datasets, show that learning with the proposed algorithm under the new conditional independence assumptions outperforms learning with the prior assumptions in cases where action information is present in the feedback.

The key strength of this paper is in its clarity. At its core, it takes an existing formalism (full-conditional-independence interaction-grounded learning), finds a key failure mode, and adjusts the formalism with new, less restrictive assumptions. Each of the steps taken along the way is well motivated, and the proofs tied to the new assumptions (separability, access to a baseline policy for symmetry breaking) are clean and relevant.

Unfortunately, the weaknesses of this paper are in the evaluation and the underlying motivation behind this work. The introduction and discussion/conclusion motivate that this type of interaction-grounded learning is critical for scenarios with multimodal interactive feedback in human-computer interaction and for designing brain-computer interaction interfaces. Especially for the latter, the problem of having feedback data that is tied to action information is already a key problem in fMRI information orthogonalization (line 340) and for eye-tracker recalibration for ALS patients (line 341). Unfortunately, none of the existing evaluations reflect these real-world use cases, instead constructing synthetic tasks based on MNIST or open classification benchmarks, tasks that are hard to understand. For example, it is hard to make the leap to these well-motivated, clear tasks from the current MNIST classification experiments, where the feedback is an arbitrary formula of the right label and the predicted action (in the paper's experiments, the feedback provided to the agent is a digit with label action 6 binary-reward 3 modulo 10). In general, it is not clear why action information intermingling with feedback is an actual problem in real-world settings, or why other simpler approaches for learning couldn't learn to decouple this information. (Edit: the new revision does address this a little bit, so I am updating my score.)

In general, this paper is severely limited by its evaluation. The paper seeks to show that the initial assumptions in the interaction-grounded learning framework are prohibitive and lead to failure modes, designing a new formulation of the framework's assumptions as well as a new learning algorithm to fix this. However, by only evaluating synthetic, toy, constructed tasks with arbitrary feedback that is not realistic, it is not clear that this is an actual problem. This paper would be considerably stronger if it were evaluated on the motivating tasks in brain-computer interfaces or human-computer interaction used in the introduction and conclusion, evaluating real-world instances where action-intermingled feedback is an actual problem.

### Summary:
This paper addresses the problem of learning to behave optimally when actions result only in new observations but no rewards; feedback is provided in the shape of a vector. This problem, known as IGL, has already been described in previous works, which had to make the assumption that the action was not included in the feedback. This paper gets rid of this assumption and provides theoretical guarantees. The discussion has been quite extensive, and the main issue raised by reviewers concerned the experimental setups: they were considered toy-ish and too far from a real application. In particular, the authors mentioned BCI and HCI in their intro (mainly focusing on the fact that having the action in the observation is unavoidable with humans in the loop) but didn't provide experiments involving actual BCI or HCI. The authors tried to address this issue by providing synthetic experiments simulating BCI and fMRI; as the authors stated, the cost of real experiments in that setup would be prohibitive. Given the effort made by the authors to provide experimental results supporting them, the algorithmic and theoretical contributions seem good enough to reach the acceptance bar.
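The meta-review above describes the interaction-grounded learning setting: the agent observes a context, chooses an action, and receives only a feedback vector while the true reward stays hidden. The sketch below is a hypothetical illustration of that interaction loop; the `env`, `policy`, and `reward_decoder` interfaces are assumptions made for this example, and this is not the contrastive algorithm proposed in the paper under review.

```python
def igl_interaction_loop(env, policy, reward_decoder, n_rounds=1000):
    """Schematic interaction-grounded-learning loop as described in the summary above.

    The agent sees a context x, picks an action a, and observes only a feedback
    vector y; the environment's true reward is never revealed. `env`, `policy`,
    and `reward_decoder` are hypothetical interfaces used for illustration.
    """
    history = []
    for _ in range(n_rounds):
        x = env.context()            # observed context
        a = policy.act(x)            # action chosen by the current policy
        y = env.feedback(x, a)       # feedback vector; the latent reward stays hidden
        r_hat = reward_decoder(y)    # decoded pseudo-reward in [0, 1]
        policy.update(x, a, r_hat)   # contextual-bandit-style update on the decoded reward
        history.append((x, a, y, r_hat))
    return history
```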
[input_ids / attention_mask / labels for this example: the tokenized Input text above, an all-ones attention mask, and a copy of input_ids (truncated at the end of the file). Long numeric token-ID arrays omitted here for readability.]
50276, 20, 40090, 884, 275, 2087, 697, 417, 2590, 2139, 253, 2250, 2388, 3987, 1981, 342, 8680, 310, 271, 2250, 1895, 275, 1524, 10186, 7533, 390, 2139, 643, 19554, 7274, 323, 4715, 812, 2649, 3037, 281, 34430, 713, 436, 1491, 50276, 15576, 253, 747, 18520, 1057, 2953, 436, 247, 1652, 2372, 594, 891, 717, 22753, 619, 4868, 275, 2087, 436, 2929, 310, 18270, 3710, 407, 697, 7103, 253, 2929, 14993, 281, 921, 326, 253, 3302, 13260, 275, 253, 5016, 2595, 264, 4715, 7792, 403, 9419, 1483, 285, 1421, 281, 4433, 10006, 20462, 247, 747, 15895, 273, 253, 31225, 13260, 347, 973, 347, 247, 747, 4715, 5933, 281, 4993, 436, 2299, 407, 760, 16344, 13506, 20953, 8818, 8892, 342, 10341, 8680, 326, 310, 417, 15958, 697, 417, 2590, 326, 436, 310, 271, 4588, 1895, 50276, 2520, 2929, 651, 320, 15455, 10046, 604, 6760, 327, 253, 15265, 839, 8892, 275, 3998, 32948, 19069, 390, 1966, 32948, 5016, 908, 275, 253, 10199, 285, 6452, 285, 16344, 1524, 10186, 10872, 273, 835, 2250, 2388, 3987, 1070, 44333, 310, 271, 4588, 1895, 2490, 187, 4118, 18435, 27, 2520, 2929, 12453, 253, 1895, 273, 4715, 281, 21319, 5556, 595, 672, 5231, 906, 760, 275, 747, 7313, 533, 642, 23267, 8680, 310, 2530, 275, 253, 5281, 273, 247, 4972, 436, 1895, 1929, 347, 25477, 77, 556, 2168, 644, 2529, 275, 2045, 2987, 534, 574, 281, 1056, 253, 9376, 326, 253, 2250, 369, 417, 2908, 275, 253, 8680, 436, 2929, 4850, 8314, 273, 436, 9376, 285, 3400, 10527, 23632, 50275, 783, 5955, 556, 644, 3240, 9470, 285, 253, 2022, 2523, 5439, 407, 30628, 7514, 253, 5661, 873, 8777, 597, 497, 2783, 20953, 763, 285, 1512, 2080, 432, 247, 1524, 2898, 3340, 253, 4477, 5393, 270, 5297, 285, 288, 5297, 275, 616, 26432, 7194, 13654, 327, 253, 958, 326, 1907, 253, 2250, 275, 253, 8310, 310, 17396, 342, 7497, 275, 253, 6287, 533, 42126, 2085, 4679, 7668, 4588, 270, 5297, 390, 288, 5297, 50275, 783, 4477, 3597, 281, 2953, 436, 2523, 407, 5277, 13506, 4679, 948, 8287, 270, 5297, 285, 49555, 363, 347, 253, 4477, 4767, 253, 2105, 273, 1524, 4679, 275, 326, 9978, 651, 320, 9419, 1483, 50275, 28821, 253, 3434, 1160, 407, 4477, 281, 2085, 5661, 1543, 8109, 731, 253, 5933, 280, 285, 10527, 9021, 1646, 1175, 2217, 281, 3986, 253, 14924, 2534, 209 ]
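The three numeric columns attached to each row (input_ids, attention_mask, labels) look like standard tokenizer output over the concatenated prompt, review, and summary text. The dump does not say which tokenizer or preprocessing script produced them, so the sketch below is only an illustration of how columns with this shape are typically built; the tokenizer name, max length, and the exact concatenation rule are assumptions, not facts taken from the dataset.

```python
# Minimal sketch, assuming a Hugging Face tokenizer; "gpt2" is a placeholder,
# not the tokenizer actually used for this dataset.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

def build_features(example):
    # Assumption: "Input" holds the prompt + review text and "Output" the summary.
    text = example["Input"] + " " + example["Output"]
    enc = tokenizer(text, truncation=True, max_length=2048)
    # In the rows shown in this dump, labels appear to mirror input_ids
    # (causal-LM style) rather than containing only the summary tokens.
    enc["labels"] = list(enc["input_ids"])
    return enc
```

With the datasets library, something like `dataset.map(build_features)` would populate these columns; the real pipeline may differ (padding, special tokens, loss masking), so treat this only as a reading aid for the arrays shown in this section.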
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper examines kernelized online optimization problems and propose to tackle the problem by some generalization of primaldual techniques the techniques developed in this paper looks plausible but i am not an expert in this area so feel a bit unsure about the depth of the techniques and wish the authors can elaborate a bit more on some conceptual questions the paper aims to optimize a function in rkhs subject to a certain softconstraint it can be violated in a sublinear manner then it argues that by carefully combining gaussian processes ubc and primaldual algorithms a nontrivial algorithm can be designed i feel the way different techniques are integrated is quite reasonable and this overall sounds like a nice result nevertheless i have a few questions 1 is there a nice way to interpret the soft constraint formulation the formulation smells like a madeup constraint because of certain limitation of the new techniques developed 2 how does this work compare to srinivas krause kakade and seeger 12 is that the only difference is that a soft constraint is added and maybe thats the reason some primal dual is needed also some submodular techniques seemed to be needed to deal with those information gain things it appears that this kind of techniques are not needed in this paper did the author manage to circumvent the submodular property in a different way or indirectly used this property somewhere 3 likely to be my ignorant does sublinear regret implies that there exists an offline algorithm that can optimize any function in strongly polynomial time with error parameter epsilon that sounds too good to be true or maybe standard gridsearch also can achieve so 4 related to 2 and 3 does the algorithm have a multidimensional generalization i imagine the result would be exponential in the dimension like skks12 so my question at the end is that if exponential regret is allowed how is this better than standard grid search comments on both the practical and theoretical aspects would be helpful same as above docsepthe authors study a stochastic bandit problem when the reward function and constraint function lie in a reproducing kernel hilbert space rkhs with a bounded norm the paper considers soft constraints that may be violated in any round as long as the cumulative violations are small they solve the restricted maximization problem using primaldual optimization and propose a flexible algorithm for various types of exploration including ucb and ts they also provide a unified analysis with sublinear regret and sublinear constraint violation their main contribution seems to suggest a unified solution for constrained kernelized bandits the paper succeds in providing a unified regret analysis that accommodates various exploration tools such as ucb and ts by circumventing previous approaches relying on lyapunovdrift arguement strength the authors develop a unified framework for kernel bandits with soft constraints using primaldual optimization and show sublinear reward regret and sublinear total constraint violation when ucb or ts type of exploration is utilized to construct a unified algorithm and regret analysis they identify a novel sufficient condition the paper is well written and clear 

 yes docsepthis paper provides a unified framework ckb for kernelized bandits optimization with unknown kernelized constraints based on primaldual optimization this framework can employ general exploration strategies gpucb and gpts and achieve sublinear cumulative regret with sublinear cumulative constraint violation a new exploration strategy randgpucb is also provided experiments on synthetic and realworld data are conducted the paper is generally wellwritten with detailed proofs provided in the appendices strengths the ckb framework can employ multiple exploration strategies under the sufficient condition ckb can attain sublinear cumulative regret and sublinear cumulative constraint violations weaknesses it is frequently mentioned that ckb can use general exploration strategies based on my understanding this algorithm requires a function estimator ft rather than an acquisition function some exploration strategies like maximum variance do not involve an estimator of the unknown function comments for line 154 a better upper bound on maximum information gain for the matern kernel has been provided in on information gain and regret bounds in gaussian process bandits comparisons between convex optimization methods and lyapunovdrift methods have been discussed docsepthe paper considers the problem of black box optimization under soft constraints where the objective of the learner is to maximize a function f subject to g leq 0 where both f and g are elements in possibly different rkhs the objective is to attain a sublinear regret and a sublinear violation of the constraint the authors propose a general algorithmic framework for such scenarios that works for a class of exploration strategies satisfying certain conditions it is shown that gpucb and gpts satisfy these conditions and the regret and violation bounds are derived for these two policies supporting numerical evidence has also been provided strengths i think the analysis techniques used in the work are interesting even though the ideas are borrowed from existing work the work draws interesting parallels and extends the analysis from gpucb to gpts i liked the discussion on convexopt and lyupanovopt techniques in the appendix weaknesses please see the next section for questions na ### Summary:
the paper provides new techniques algorithmic as well as analytical to solve black box optimization of smooth functions with constraints the reviewers are largely in favor of the papers contributions and the author responses have helped to clarify several aspects of the presentation and connections to existing work therefore i recommend that the paper be accepted
[input_ids, attention_mask (all 1s), and labels for this row omitted; the labels array begins with the same token IDs as input_ids (30003, 310, 1677, …)]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this papers uses the label hierarchy to drive the search process over a set of labels using reinforcement learning the approach offers clever and promising techniques to force the inference process in structured classification to converge but experiments seem to lack appletoapple comparisons however i think the authors should rather present this work as structured classification as labels dependencies not modeled by the hierarchy are exploited and as other graph structure could be exploited to drive the rl search i tend to see hierarchical classification as an approach to multilabel classification justified by a greedy decomposition that reduced both training and test time this view has been outmoded for more than an decade first as flat approaches became feasible and now as endtoend structured classification is implementable with dnns see for instance david belanger work with mccallum compared to other structured classification approaches whose scope is limited by the complexity of the inference process this approaches is very attractive the authors open the optimization black box of the inference process by adding a few very clever tricks that facilitate convergence intermediate rewards based on the gain on f1 score self critical training approach clamped pretraining enabled by the use of state embeddings that are multiplied my a transition to any state in the free mode and just the next states in the hierarchy in the clamped mode addition of a flat loss to improve the quality of the document representation while those tricks may have been used for other applications they seem new in the context of hierarchicalmultilabelstructured classification while the experiments appear thorough they could be the major weakness of this paper the results the authors quote as representative of other approaches seem in fact entirely reproduced on datasets that were not used on the original papers and the authors do not try an appletoapple comparison to determine if this reproduction is fair none of the quoted work used the 2018 version of yelp and i could only find rcv1 microf1 experiments in johnson and yang who report a 84 microf1 far better than the 766 reported on their behalf here and better than the 827 reported by the authors i read note 4 about the difference in the way the threshold is computed but i doubt it can explain such a large difference i did not check everything but could not find and appletoapple comparison have the network architecture been properly optimized in terms of hyperparameters in particular having tried kim cnn on large label sets i suspect the author settings using a single layer after the convolution is suboptimal i concur with the following paper than an additional hidden layer is essential liu et al deep learning for extreme multilabel text classification i also note the 32 batch size could be way too small for sparse label sets i tend to use a batch size of 512 on this type of datadocsepthis work proposes an rl approach for hierarchical text classification by learning to navigating the hierarchy given a document experiments on 3 datasets show better performance im happy to see that it was possible to 1 we optimize the holistic metrics over the hierarchy by providing the policy network with holistic rewards i dont quite understand what are the holistic metrics and holistic rewards i would like the authors to answer what exactly does reinforcement learning 
get us is it optimizing f1 metric or is it the ability to fix inconsistent labeling problem if it is the latter what is an example of inconsistent labeling what fraction of errors in table 23 are inconsistent errors are we really seeing the inconsistent errors drop if it is the former how does this compare to existing approaches for optimizing f1 metric 2 the f1 score of each sample xi a f1 is a population metric what does it mean to have f1 for a single sample b im not aware of any work that shows optimizing perexample f1 minimizes f1 metric over a sample 3 with 10 rollouts per training sample imho it seems unrealistic that the expected reward can be computed correctly wouldnt most of the reward just be zero or is it the case the model is initialized with an mle pretrained parameters which seems like it but im not too sure results analysis imho most of the rows in table 2 does not seem comparable with each other due to pretrained wordembeddings and dataset filtering eg svmvariants hlstm in addition to above there is the standard issue of using different parameters across models which increasesdecreases model capacity this is ok as long as all parameters were tuned on held out set or using a common well established unfiltered test set neither of which is clear to me it is not clear how the f1 metric captures inconsistent labeling which seems to be the main selling point for hilap side comment reg textcnn performance could it be that dropout is too high the code was set to 05 docsepthis paper presents an end to end rl approach for hierarchical text classification the paper proposes a label assignment policy for determining the appropropriate positioning of a document in a hierarchy it is based on capturing the global hierachical structure during training and prediction phases as against most methods which either exploit the local information or neural net approaches which ignore the hierarchical structure it is demonstrated the method particularly works well compared to sota methods especially for macrof1 measure which captures the label weighted performance the approach seems original and a detailed experimental analysis is carried out on various datasets some of the concerns that i have regarding this work are the problem of hierarchical text classification is too specific and in this regard the impact of the work seems quite limited the significance is further limited by the scale of the datasets of considered in this paper the paper needs to evaluate against on much bigger datasets such as lshtc datasets httplshtciitdemokritosgr for instance the dataset available under lshtc3 is in the raw format and it would be really competitive to evaluate this method against other such as flat svm and hrsvm4 on this dataset and those from the challenge the experimental evaluation seems less convincing such as the results for hrsvm for rcv1 dataset are quite different in this paper and that given hrsvm paper it is 81665656 vs 728386 reported in this paper given that 81665656 is not too far from that given by hilap it remains a question if the extra computational complexity and lack of scalability of the proposed method is really a significant advantage over existing methods some of the references related to taxonomy adaptation such as 3 and reference therein which are also based on modifying the given taxonomy for better classification are missing comparison with label embedding methods such as 12 are missing for the scale of datasets discussed where svm based methods seem to be working well it is 
possible that approaches 12 which can exploit label correlations can do even better 1 k bhatia h jain p kar m varma and p jain sparse local embeddings for extreme multilabel classification in nips 2015 2 h yu p jain p kar and i dhillon largescale multilabel learning with missing labels in icml 2014 3 learning taxonomy adaptation in largescale classification jmlr 2016 4 recursive regularization for largescale classification with hierarchical and graphical dependencies httpsdlacmorgcitationcfmid2487644 ### Summary:
this paper presents a reinforcement learning approach to hierarchical text classification pros a potentially interesting idea to drive the search process over a hierachical set of labels using reinforcement learning cons the major concensus among all reviewers was that there were various concerns about experimental results eg appletoapple comparisons against prior art r1 proper tuning of hyperparameters r1 r2 the label space is too small 539 to have practical significance compared to tens of thousands of labels that have been used in other related work r3 and other missing baselines r3 in addition even after the rebuttal some of the technical clarity issues have not been fully resolved eg what the proposed method is actually doing optimizing f1 metric vs the ability to fix inconsistent labeling problem verdict reject while authors came back with many detailed responses they were not enough to address the major concerns reviewers had about the empirical significance of this work
[input_ids, attention_mask (all 1s), and labels for this row omitted; labels again begin with the same token IDs as input_ids]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
The paper considers the problem of continual learning for semantic segmentation. The authors propose two simple ideas to improve the architecture: using shared but wider convolution modules, and introducing dropout into the encoder-decoder architecture. Experiments have been conducted on the PASCAL VOC dataset, outperforming previous approaches.

Strengths:
- The paper is well written and easy to follow.

Weaknesses:
- Writing-wise, the paper discusses the information bottleneck at length; I don't know how relevant it is, as the proposed solutions are more like empirical tricks.
- I think the considered scenario does not reflect what happens in practice. Class-incremental semantic segmentation (disjoint and overlapped) both consider training examples where only the background class c_bt and the classes in C_t are used as labels. If we do have labels for task t-1, why are we not allowed to use them? If it is for efficiency, why can't we use only a portion of them, as in replay?
- The proposed solutions, although simple, do not give much insight; they are more like tricks.
- Experimenting on PASCAL alone is not good enough. If the idea is to continuously learn new tasks, or in fact to learn to segment new categories, it would make sense to test at least on COCO with 80 classes, or even LVIS with over 1,000.

Overall, I don't think this paper should be published at NeurIPS. Limitations have been included in the paper.

docsep

The authors first reveal that over-compression of features causes model degradation when training the backbone of the continual segmentation model, and they try to explain the reason from the perspective of the information bottleneck principle. They then retrain the backbone combined with an extra dropout layer and train the classifier with a greater number of features than SSUL. The experimental results show the effectiveness of the proposed methods.

The proposed method is based on SSUL, but the introduction to SSUL is not detailed enough, which may lead to confusion and misunderstanding. From the whole paper, it seems that a strong backbone can significantly improve the accuracy of CISS, while the classifier just needs to classify features. The authors reveal that over-compression of features causes the model-degradation problem when training the backbone and give some tricks for alleviating it, while their proposed methods try to extract more so-called robust or high-quality features for prediction; these methods cannot solve the over-compression problem. Experimental results partially prove the effectiveness, but theoretical proof and an ablation study are lacking. Prior knowledge about the SSUL model needs to be listed in more detail, and some experimental settings should be clearly described, especially the proposed modules and methods. For question 3: conduct an ablation study exploring the same number of channels in the SSUL head layer. For question 4: conduct experiments with dropout layers added after the other convolutional layers. Experimental results partially show the effectiveness of the proposed methods; try to explain them in theory.

docsep

This paper proposes to tackle continual semantic segmentation by addressing the over-compression issue in model learning. In order to improve the generalization ability of the representation to upcoming tasks, the authors draw inspiration from previous studies in the field and introduce wider convolutions for final feature extraction, as well as apply dropout to the output of the feature encoder. The proposed method achieves good performance on the PASCAL VOC dataset.

Pros:
- The paper is clear and easy to read.
- The proposed method is simple and efficient.
- The proposed method achieves good performance on the PASCAL VOC dataset.

Cons:
- The contribution of this paper is limited. Previous works have demonstrated the effectiveness of wide convolution layers [a] and dropout [b] for continual learning problems; this paper basically applies these techniques to the semantic segmentation task, which can hardly be considered a significant contribution.
- The authors try to interpret the effectiveness with the information bottleneck principle, yet either theoretical or empirical explanations are missing. The improvements in final accuracy alone are not sufficient to convince the reader of the paper's completeness. By the way, the term "generalisation phase" sounds weird for explaining the information bottleneck principle in Sec. 2.2.
- Experiments are only conducted on the PASCAL VOC dataset, while comparisons on ADE20K are missing.
- Since the proposed method is designed with a fixed feature extractor, it looks hard to generalize to situations where the base class set is small and the new task set is large.

In summary, this paper presents a simple framework for continual semantic segmentation, yet the contributions are insignificant and the experiments are insufficient.

[a] Mirzadeh et al. Architecture matters in continual learning.
[b] Mirzadeh et al. Dropout as an implicit gating mechanism for continual learning.

N/A

docsep

In this work, the authors state that over-compression is a vital problem for continual learning and introduce two simple improvements for the class-incremental semantic segmentation (CISS) problem. The first improvement is widening the network layers before the final classification layer to improve feature quality. The second improvement is adding a dropout layer after the encoder during the offline phase of model training to create more robust features for further CL steps.

Strengths:
- The authors focus on the important topic of over-compression, which has a negative impact on continual-learning model training. They also show an increase in performance in specific setups using the two proposed improvements.
- A possible novelty is using the dropout layer only during the offline training phase of the CL model to prevent over-compression.

Weaknesses:
- Originality: the paper lacks originality. The authors propose to use previously introduced improvements for a particular problem setting. The proposed new evaluation framework is also not original; it is likely a kind of standard online-learning setup.
- Quality: one of the main claims (line 8, lines 78-80), that CL performance can be improved by increasing the feature expressiveness of the learnt representations, is not directly supported with evidence within the work; the authors do not use any metrics to show improvement in the representations. Without an additional ablation study, it is not clear how the parts of the solution impact the overall results. There are multiple minor improvements or ideas that are not strongly connected; for example, the evaluation-scheme proposal is not directly connected to the other presented ideas. The authors state that they did not have enough computing resources to provide error bars for the reported results.
- Clarity: the work is not self-contained. For example, the authors mention the background-shift problem multiple times and state in line 153 that this is a unique problem of continual semantic segmentation, but they do not explain the essence of this problem. It is unclear how different parts of the related-work section relate to the proposed solution. The results in bold in Table 2 are not the best in their columns, which is misleading. Some words are missing in the Table 1 caption.
- Limitations of the proposed improvements were not described. The authors mentioned a general positive societal impact of efficient CL training.

### Summary:
This paper deals with continual learning in semantic segmentation. The authors introduce a wider convolution at the final feature-extraction layer and apply dropout to limit the over-compression issue. No reviewer was convinced by the approach, and they have raised many issues, including the model design choices, the training protocol, and missing experiments. No rebuttal has been provided by the authors. As it is, this submission is not ready for publication, and we encourage the authors to consider the reviewers' feedback for future publication.
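The two changes that recur throughout this example, a wider convolution in front of the classifier and dropout on the encoder output during the offline (base) training phase, can be made concrete with a short PyTorch sketch. This is an illustrative sketch only, not the authors' implementation: the module name, channel widths, dropout rate, and the offline-phase switch are assumptions.

```python
import torch
import torch.nn as nn

class SegmentationHead(nn.Module):
    """Illustrative head for class-incremental segmentation (assumed design).

    It combines the two modifications discussed above: dropout applied to the
    encoder output (intended for the offline/base training phase) and a wider
    convolution before the per-class classifier.
    """

    def __init__(self, encoder_channels=2048, head_channels=512,
                 num_classes=21, dropout_p=0.3, offline_phase=True):
        super().__init__()
        # Dropout on encoder features; disabled (p=0) outside the offline phase.
        self.feature_dropout = nn.Dropout2d(p=dropout_p if offline_phase else 0.0)
        # Wider 3x3 convolution before the final classifier.
        self.wide_conv = nn.Sequential(
            nn.Conv2d(encoder_channels, head_channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(head_channels),
            nn.ReLU(inplace=True),
        )
        # Final 1x1 classifier over the head features (21 classes, as in PASCAL VOC).
        self.classifier = nn.Conv2d(head_channels, num_classes, kernel_size=1)

    def forward(self, encoder_features):
        x = self.feature_dropout(encoder_features)
        x = self.wide_conv(x)
        return self.classifier(x)

# Usage with a dummy encoder output: batch of 2, 2048 channels, 32x32 feature map.
head = SegmentationHead()
logits = head(torch.randn(2, 2048, 32, 32))
print(logits.shape)  # torch.Size([2, 21, 32, 32])
```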
input_ids: [token-ID sequence for the example above; full list omitted]
attention_mask: [all-ones mask matching input_ids; full list omitted]
labels: [token-ID sequence; full list omitted]
Below is a review of a research paper from a conference journal. Please write a summary of the review.

### Review:
This paper investigates the inconsistency problem in LOLA: each LOLA agent assumes the other agent is a naive learner, resulting in LOLA agents not converging to SFPs in some games. The paper aims to address this problem with infinite-order LOLA, in which the agents can have a consistent view of each other, and empirically observes that HOLA may not resolve LOLA's convergence issues. Instead of HOLA, this paper proposes COLA, which employs neural networks to explicitly minimize the consistency loss. Empirically, COLA finds the consistent solution when HOLA converges and finds more stable solutions when HOLA diverges.

Strengths:
1. This paper studies closely related literature (i.e., LOLA, HOLA, CGD) and makes interesting empirical observations for the community.
2. COLA only requires up to second-order derivatives, compared to iLOLA, which requires many higher-order derivatives.

Weaknesses/concerns:
1. My main concern is COLA's benefit against SOS (Letcher et al., 2019). SOS not only has a theoretical convergence guarantee to SFPs, but also retains LOLA's opponent shaping to achieve higher performance if needed. However, COLA does not have a convergence guarantee, and COLA may perform worse than SOS when it does not converge to SFPs (e.g., COLA-0.1 in the Tandem game). Hence, it is unclear when COLA should be used instead of SOS.
2. Could you clarify further how the consistency definition (Definition 1) avoids the infinite regress problem? Specifically, Equation 5 is dependent on Equation 6, as Equation 5 includes h2; similarly, Equation 6 is dependent on Equation 5, as Equation 6 includes h1. As a result, the infinite regress problem can arise when we replace h2 in Equation 5 with Equation 6: to compute h1 on the left-hand side of Equation 5, we require h2 from Equation 6, which in turn requires computing h1 from Equation 5, and this recursion continues.
3. While it is an interesting idea to learn the update functions h1 and h2 via neural networks parameterized by phi1 and phi2, I am concerned about the scalability of this approach. When the dimensions of theta1 and theta2 are small, learning update functions whose outputs have the dimension of theta1 and theta2 is possible. However, when theta1 and theta2 are policies represented by neural networks, the dimensions of theta1 and theta2 are large, which makes learning h1 and h2 difficult.
4. In Figure 1, CGD is only compared in the Tandem domain. Why is CGD not compared in the other domains, including the Matching Pennies competitive game?
5. This is minor, but it is difficult to understand the experimental results because Tables 1-2 and Figure 1 are not positioned near Section 6.

I initially vote for a score of 3. While this paper makes interesting empirical observations regarding related works, I am unsure how it improves over the state-of-the-art approach in the literature, such as SOS. After reading the authors' responses to my questions, I am open to raising my score.

docsep

The authors tackle the consistency problem of the original LOLA formulation. The paper investigates HOLA convergence, demonstrates that CGD does not correspond to high-order LOLA in general, and proposes COLA to directly address the consistency problem. The proposed method, COLA, seems more robust to different look-ahead values where HOLA diverges. The authors also find that COLA is still sometimes susceptible to the arrogant LOLA behavior, opening questions for future work.

Strengths:
- The paper is well written and provides a well-motivated investigation into LOLA's failure to preserve SFPs and the corresponding arrogant behavior.
- The explanation of cases where CGD is not equivalent to iLOLA in general-sum games seems important and significant.
- The motivation behind the proposed COLA method is well explained and justified, and the empirical results are thorough and support the authors' claims.

Weaknesses:
- A more thorough proof for the CGD argument would strengthen the paper, as would including it as a baseline in more of the empirical evaluations.

This work attempts to overcome and investigate weaknesses of LOLA to tackle the consistency problem. Even with an explicit consistency loss, the authors still find that the arrogant behavior remains, so the insights from this investigation seem relevant for future work and open questions in this area.

docsep

This paper deals with the problem of learning in differentiable games. Mainly, the paper tackles the problem of learning in games while taking into account the learning of the opponent as well. The main contribution of the paper is to point out a flaw in the existing claims regarding the correspondence between competitive gradient descent and iLOLA. The paper further gives a definition of consistent update rules for differentiable games and, based on this definition, shows that the iLOLA update rule is consistent. The paper proposes a new algorithm, COLA, and shows empirically that it finds more consistent solutions.

Overall, I like the paper. However, the main issue for me was that I could not clearly see the relation between CGD and LOLA. This seems to be an important correction to existing claims; however, the paper says that this is not a rigorous proof of the correction. The problem is that if an existing claim is incorrect, then it is useful to correct it as soon as possible, but the proof of the correction should be quite clear, and I did not find that to be the case. Ideally, it would be nice to first state that the existing claim says the series expansion will look like this, show the particular term that was overlooked and that appears in LOLA's loss, and thus derive the correction to the existing claim. Overall, I think even just this correction could be a sufficient contribution for publication.

The paper claims that COLA follows a consistent update rule; however, empirically COLA does not converge to a tit-for-tat strategy as desired, or as LOLA does. Then what is the motivation behind using COLA over LOLA?

I am not sure what the original claim from the CGD paper is and whether it is correctly interpreted in this paper. It seems that the original claim from the CGD paper is that the series expansion of CGD recovers HOLA (high-order LOLA); this paper says that this implies CGD is equal to iLOLA, and I am not sure how this is true.

While I have other minor points, I think the claim regarding the original statement in the CGD paper and its correction should first be made clear. My decision mainly depends on whether the claim that the paper makes a correction to the existing literature is true or not, and whether the paper can make it more clear.

docsep

The paper is on learning in a differential game setting by accounting for the ability of the opponent to learn. It displays several issues in previously existing methods, proposes a new method, and demonstrates some features that appear to be superior to the existing methods.

The paper focuses on an interesting challenge in learning in the presence of opponents. On the other hand, the contributions the paper makes do not meet the claims made throughout the paper. For example, take Section 4.1: "this is not a rigorous proof, but it should be intuitively clear". First of all, "should" has no meaning: what if it is not? Second, this is one of the main claims that the paper is built on; if you are not going to give a rigorous proof for this, what will you give a proof for? Too much is left to future work. The paper is also filled with similar vague language.

In addition to the wishy-washy language, central concepts like consistency and stability are used throughout the paper, for the most part without a clear definition (at least for a long time), and even comparative forms (e.g., "more stable") are used even though nothing has been made quantitative.

While this shortcoming may be inherited from the literature, the assumptions made in the paper (e.g., knowledge of the opponent's payoff functions, parameters, and gradients) are just way too strict. Coupled with the lack of rigorous results, the insights established in the paper are at best limited.

There is a proposition that gives the means to check whether a pair of update functions is consistent; on the other hand, it is not clear how one actually checks the condition for consistency given in the proposition.

Some of the questions posed at the beginning of the results section are odd. It is not clear whether an empirical analysis can even answer such questions as stated; only rigorous proofs would answer them, and the paper lacks those. Similarly, the discussion of the observations from the empirical results overgeneralizes.

The paper is on an interesting issue, yet it is at best a starting point and far from prime time.

After the author response: thanks to the authors for the response; unfortunately, it does not change my assessment.

### Summary:
The main contribution of this paper is that it points out incorrect claims in the multi-agent RL literature and provides new insight into the failure modes of current methods. Specifically, this paper investigates the inconsistency problem in LOLA, meaning that it assumes the other agent is a naive learner and thus does not converge to SFPs in some games. It then shows problems with two fixes in the literature: (1) HOLA addresses the inconsistency problem only when it converges; otherwise, HOLA does not resolve the issue; (2) CGD does not resolve the issue, although it claims to do so. This paper then proposes a method, COLA, that fixes the inconsistency issue and outperforms HOLA when HOLA diverges. Reviewers generally agree that the insight from this work is interesting and important for the field; however, there were some concerns about both the theory and the experiments. While the updated version addresses some of the concerns, it also made significant changes to both the theoretical and the empirical sections and would benefit from another round of close review. Thus, I think the current version of this work is borderline.
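The consistency loss mentioned in the summary and in the reviews can be sketched as follows. The paper's exact Equations 5 and 6 are not reproduced here; the sketch only assumes the reading discussed above, namely that each agent's learned update h1 (respectively h2) should match the gradient of its own loss evaluated after the other agent applies its update. The quadratic toy losses and the linear update networks are placeholders, not anything taken from the paper.

```python
import torch

def consistency_loss(h1_net, h2_net, loss1_fn, loss2_fn, theta1, theta2, alpha=1.0):
    """Sketch of a COLA-style consistency objective (assumed form, not the paper's exact equations).

    h1_net and h2_net map the joint parameters to proposed updates for agents 1 and 2.
    """
    joint = torch.cat([theta1, theta2])
    h1 = h1_net(joint)
    h2 = h2_net(joint)

    # Agent 1's update should equal the gradient of its loss under the assumption
    # that agent 2 has already applied its learned update h2 (and vice versa).
    # The gradient also flows through h2, since h2 itself depends on the parameters.
    target1 = -alpha * torch.autograd.grad(
        loss1_fn(theta1, theta2 + h2), theta1, create_graph=True)[0]
    target2 = -alpha * torch.autograd.grad(
        loss2_fn(theta1 + h1, theta2), theta2, create_graph=True)[0]

    return ((h1 - target1) ** 2).sum() + ((h2 - target2) ** 2).sum()

# Tiny usage example with placeholder quadratic losses and linear update networks.
torch.manual_seed(0)
theta1 = torch.randn(2, requires_grad=True)
theta2 = torch.randn(2, requires_grad=True)
h1_net = torch.nn.Linear(4, 2)
h2_net = torch.nn.Linear(4, 2)
L1 = lambda a, b: ((a - b) ** 2).sum()
L2 = lambda a, b: ((a + b) ** 2).sum()
print(consistency_loss(h1_net, h2_net, L1, L2, theta1, theta2).item())
```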
input_ids: [token-ID sequence for the example above; full list omitted]
attention_mask: [all-ones mask matching input_ids; full list omitted]
labels: [token-ID sequence; full list omitted]
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
1. Summary: The authors proposed the negative data augmentation (NDA) technique, which is useful for generative adversarial networks, anomaly detection, and self-supervised learning frameworks. The idea is simple, and the technique was proven to be powerful for several tasks. They performed several experiments, and I think the experiments were enough to show the technique's superiority.
2. Strong points: good idea; strong experimental results; simple to use; easy to understand.
3. Weak points: In Figure 3 they claim that, in the absence of NDA, the support of a generative model learned from samples may overgeneralize. I am not sure that the sentence is true.
This paper is well written and concrete. I recommend that this paper should be presented at ICLR 2021.

This paper investigates how augmenting the negative examples, not just the positive examples, can improve a variety of representation learning tasks. The paper investigates a number of different augmentations and applies them to GANs and contrastive learning with images and videos.
Strengths: A major strength of the paper is its simplicity. The method is fairly straightforward to implement into several approaches, and it obtains strong results on each approach evaluated in the paper; the approaches are evaluated on GANs and contrastive learning with images and videos. Although the novelty of this method is limited, the paper does a good job at establishing some theoretical results to give intuition for why the method works. In contrast to the number of advances in machine learning that lack intuition into why they work, this paper does a good job at offering some explanations and motivations for the approach. Although this paper focuses on images and videos, the same ideas could be extended to other modalities such as text or audio as well. The experiments are convincing to show the generality of this idea: the experiments are on several different datasets and are supported by theoretical results establishing intuition into why the method works. The introduction does a good job at establishing the difference to other data augmentation methods, in particular by using negative examples. The paper is well written and easy to read.
Weaknesses: In some cases the negative data augmentations may actually be inside the positive set. How would the approach scale with noise in the negative augmentations?

This paper presents a method that uses artificial augmentation of data as negative (a.k.a. OOD) samples to improve various computer vision tasks, including generation and unsupervised learning on images and videos.
Pros:
- The paper is very well written.
- Experiments are comprehensive across different tasks.
- The usage of data augmentation seems interesting, but with some questions (see below).
- It designs losses for both GANs and contrastive representation learning.
- Code is provided.
Cons:
- Augmentation has been proven in GANs to provide benefits through consistency training (e.g., CR-GAN, ICLR 2020; Image Augmentations for GAN Training). These samples are used as positive samples that should generate consistent predictions; the most famous, mixup, is also treated as positive samples for training. So the augmentation usage here is a bit counterintuitive to me, because you show the opposite conclusion. Is that because only particular augmentations (e.g., jigsaw) can be used as negative samples? The answer to this question is critical. However, the paper does not mention or study much of the advanced self-supervised contrastive
learning that relies on strong augmentation; how can negative samples be adapted to these methods? How do we categorize the augmentation types used for general (positive) cases versus NDA cases? Any insights on what kinds of augmentations are useful for NDA? For example, in Figure 9 the paper proposes to push samples and their jigsaw versions away from each other; however, these two share strong local visual content of the objects, just like an image and its cropped parts, which contrastive learning usually wants to pull together, while the proposed method tries to push them apart. Any insights into why it should work? If the justifications for these questions are sufficient, I think this can be a strong paper.

Post rebuttal: The authors addressed most of my concerns well; I increase my rating. However, the authors need to address all comments of the reviewers and also discuss all missing related works in the updated version.

Summary: The paper proposes a new method of leveraging negative samples, i.e., out-of-distribution samples purposely generated from the training data distribution, in generative modeling and representation learning. The main idea is to leverage the inductive bias of negative samples to constrain the learning of the model; e.g., these negative samples may tell us more about the support of the data. The experimental results suggest that using these negative samples in GANs (studied with the BigGAN model for conditional/unconditional image generation) and in contrastive representation learning (studied with CPC, Oord et al. 2018, for unsupervised learning on images and videos) improves the performance of the baselines. The paper also reports improvements in image-to-image translation and anomaly detection, and provides theorems to prove the convergence of the proposed model on GANs and CPC. Overall, the paper is easy to read and the idea makes sense; however, I am a bit concerned about the theory, the significance of the improvements, and the fairness of the comparison. The paper also misses discussing and comparing with recent works on data augmentation for GANs.

Strengths:
S1. The usage of negative examples, obtained from some prior knowledge, to provide evidence to the learning model about the support/geometry of the data distribution sounds reasonable. It has been applied in Sung et al. 2019 in semi-supervised learning; the proposed method applies it to new applications in generative and representation learning.
S2. The experimental results are quite extensive with regard to the applications, and the improvements on GANs are quite significant with the jigsaw augmentation.

Weaknesses:
W1. The paper does not provide very detailed implementations of the proposed models, which makes it a bit difficult to justify their correctness.

Generative learning:
W2. The detail of how to incorporate NDA into a GAN is not clear. Also, the PDA baseline for the GAN is not discussed in detail. Do the PDA, NDA, and baseline BigGAN train with the same batch size? I guess that PDA and NDA had more augmented samples, so the batch size is larger than for the BigGAN baseline.
W3. The paper does not discuss important related works (a, b, c below) on data augmentation for GANs that were recently published. In these papers, they show that transforming only real samples (which, if I understand correctly, is likely similar to the PDA baseline) to train the GAN changes the target distribution, so the generator will learn infrequent or out-of-distribution samples; however, if both real and fake samples are transformed, data augmentation is helpful in training GANs. Can the authors compare the proposed NDA to at least one of them with the same GAN model? (a) Differentiable
Augmentation for Data-Efficient GAN Training; (b) On Data Augmentation for GAN Training; (c) Training Generative Adversarial Networks with Limited Data.
W4. Eq. 10 showed that $L_f(\lambda G_\theta + (1-\lambda)\bar{P}, D_\phi) = D_f(P \,\|\, \lambda Q + (1-\lambda)\bar{P})$, but then infers the lower bound $D_f(P \,\|\, \lambda Q + (1-\lambda)\bar{P}) \ge \lambda f\!\left(\tfrac{1}{\lambda}\right) + (1-\lambda) f(0) = D_f(P \,\|\, \lambda P + (1-\lambda)\bar{P})$. Therefore, theoretically, I am a bit concerned about the convergence of the model; I wonder whether the authors need an upper bound instead of the lower bound in this case.
W5. The paper claimed that random horizontal flip is not effective as an NDA because flipping does not spatially corrupt the image but is rather a semantics-preserving transformation. How about random vertical flip only; can it improve the model? This augmentation looks very well suited to telling us about the boundary of the data.

Representation learning:
W6. The improvements in representation learning do not look significant to me, and they are not very consistent across different datasets depending on the type of augmentation.
W7. The lower bound of Eq. 18 looks like it improves just because a larger batch of negative samples is added to train CPC. Can the authors compare NDA to CPC trained with just the same batch size as the NDA method?
### Summary:
All reviewers find the proposed data augmentation approach simple, interesting, and effective. They agree that the paper does a good job exploring this idea with a number of experiments. However, the paper also suffers from some drawbacks, and reviewers raise questions about some of the conclusions of the paper; in particular, how to designate an augmentation as either negative or positive is not clear a priori to training. While I agree with this criticism, I believe the paper overall explores an interesting direction and provides a good set of experiments that can be built on in future works, and I suggest acceptance. I encourage the authors to address all the reviewers' concerns as per the feedback in the final version.
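To make the reviewers' GAN-side questions above more concrete (in particular W2, how negative augmentations actually enter the discriminator objective, and the mixture $\lambda G_\theta + (1-\lambda)\bar{P}$ in W4), here is a minimal sketch. The jigsaw routine, the mixing weight, and the non-saturating loss are illustrative assumptions on my part, not the paper's actual implementation.

```python
# Illustrative sketch (not the paper's exact implementation): a discriminator update
# where part of the "fake" batch is replaced by jigsaw negative augmentations of real
# images, so the fake distribution becomes roughly lambda*G + (1-lambda)*NDA.
import torch
import torch.nn.functional as F

def jigsaw_negative(images, grid=2):
    """Shuffle a grid x grid arrangement of patches to build an out-of-support negative."""
    b, c, h, w = images.shape
    ph, pw = h // grid, w // grid
    patches = images.unfold(2, ph, ph).unfold(3, pw, pw)           # (b, c, grid, grid, ph, pw)
    patches = patches.reshape(b, c, grid * grid, ph, pw)
    perm = torch.randperm(grid * grid, device=images.device)
    patches = patches[:, :, perm].reshape(b, c, grid, grid, ph, pw)
    rows = [torch.cat([patches[:, :, i, j] for j in range(grid)], dim=3) for i in range(grid)]
    return torch.cat(rows, dim=2)

def discriminator_loss(D, real, fake, lam=0.75):
    """Standard GAN loss where a fraction (1 - lam) of the fake batch is jigsaw negatives."""
    n_neg = int((1 - lam) * real.size(0))
    fake_mix = torch.cat([fake[n_neg:], jigsaw_negative(real[:n_neg])], dim=0)
    logits_real, logits_fake = D(real), D(fake_mix)
    loss_real = F.binary_cross_entropy_with_logits(logits_real, torch.ones_like(logits_real))
    loss_fake = F.binary_cross_entropy_with_logits(logits_fake, torch.zeros_like(logits_fake))
    return loss_real + loss_fake
```

The PDA variant the reviewers contrast this with would instead apply the same transform to real images while keeping their "real" label.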
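On the contrastive side (the jigsaw-as-negative question raised in the third review and the batch-size concern in W7), a hedged sketch of an InfoNCE-style loss with extra negative-augmented embeddings could look as follows; the function and variable names are illustrative, not CPC's actual code.

```python
# Illustrative sketch (assumed, not the paper's code): an InfoNCE-style loss where
# embeddings of jigsaw-transformed images are appended as extra negatives for every
# anchor, rather than being treated as positives as in standard contrastive learning.
import torch
import torch.nn.functional as F

def info_nce_with_nda(z_anchor, z_positive, z_nda, temperature=0.1):
    """z_anchor, z_positive: (n, d) embeddings of two positive views; z_nda: (m, d)."""
    z_anchor = F.normalize(z_anchor, dim=1)
    z_positive = F.normalize(z_positive, dim=1)
    z_nda = F.normalize(z_nda, dim=1)

    pos = (z_anchor * z_positive).sum(dim=1, keepdim=True)          # (n, 1) positive logits
    neg_batch = z_anchor @ z_positive.t()                            # (n, n) in-batch negatives
    neg_nda = z_anchor @ z_nda.t()                                   # (n, m) NDA negatives
    mask = torch.eye(z_anchor.size(0), dtype=torch.bool, device=z_anchor.device)
    neg_batch = neg_batch.masked_fill(mask, float('-inf'))           # drop each anchor's own positive

    logits = torch.cat([pos, neg_batch, neg_nda], dim=1) / temperature
    labels = torch.zeros(z_anchor.size(0), dtype=torch.long, device=z_anchor.device)
    return F.cross_entropy(logits, labels)
```

The control asked for in W7 then amounts to replacing z_nda with an equal number of ordinary in-batch negatives and checking whether the gain persists.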
Below is a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
The authors train a neural network to do object recognition on downsampled, moving images of objects. They show that by using a recurrent neural network in the early layers, it can learn to produce representations that result in recognition performance nearly as good as with static, full-resolution images. The overall thesis here is very interesting; however, there are numerous concerns.
One of the main claims of this paper is that dynamic retinal input combined with recurrent processing is key to how the brain manages to do hyperacuity. The results of Table 1 seem at first glance to support this; however, none of the baseline models considered the most important control: what if you just feed in a series of static images, without motion, to the DRC-FE as is? All that we can conclude from this paper is that a recurrent network in the early stages somehow helps, but it is not clear that the motion in the input image has anything to do with this. In fact, if the motion did help, it would beg even more questions. The 8x8 images were created by downsampling from the 32x32 images with bicubic interpolation, essentially smoothing or low-pass filtering. If you simply move and resample a low-pass-filtered image, there is no new information that can be exploited by later information processing, assuming that it was low-pass filtered below the Nyquist rate for an 8x8 image, which presumably it was (an important detail that is missing). This is given by basic signal processing. It seems plausible that recurrent computation in the early layers helps (it is essentially like making a deeper network), but it would appear the effect has nothing to do with the motion in the input.
The paper seems motivated by neuroscience and psychophysics, but there is very little attempt to tie anything about the neural architecture of the model to substrates in the brain. For example, it is mentioned that neurons exhibit temporal dynamics with phasic responses, but none of this is incorporated in the model. This seems like run-of-the-mill deep convnet engineering as opposed to neuroscience; I'm not sure what we learn here from a neuroscience point of view. There is no overall theory presented as to how the brain could benefit from motion of the sensor in building a higher-acuity representation enabling tasks such as hyperacuity. There is much verbal reasoning in the introduction; however, there is now much engineering and mathematical know-how about how such problems can be solved (e.g., superresolution). These works are mentioned at the end in the discussion, but then almost immediately dismissed because they reconstruct the image rather than doing recognition. This is a shame, because the theory behind these models is exactly what the authors need to implement their idea. Instead, all of the requisite established theory is tossed aside and the authors resort to training a neural network to solve the problem, yielding a nontransparent solution providing little insight into how the brain might actually solve this problem.
The introduction does not properly attribute prior work. First, Rucci et al. have been writing and talking about the benefits of image motion for more than a decade now, but you wouldn't know this by reading the intro; although Rucci is cited, it is about drift motion in general and not with regard to his theory of why image motion is helpful, which is well known in the vision science community. Burak's important earlier work (2010) is cited but misattributed as providing an account for how drift motion could improve acuity, which is wrong: Burak's model shows how the cortex could disentangle shape from motion from retinal spike trains so as to recover shape information on the retina, but does not address the question of why the motion may be beneficial to begin with. Also missing in the intro is any mention of Ratnam et al. 2017 and Anderson et al. 2020. Those works are brought up in the discussion at the end, but given the high degree of relevance of these prior works to the authors' thesis, it is baffling why they are not brought up earlier, especially with regard to what the authors hope to do here that goes beyond or improves upon this prior work. An interesting idea, but the implementation is problematic.
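As a concrete illustration of the control this review asks for (a repeated static low-resolution frame versus frames sampled along a drift trajectory from the same low-pass-filtered image), a minimal input-generation sketch is given below. The bicubic resize, the drift step size, and the single-channel assumption are mine, not the paper's actual pipeline.

```python
# Sketch of the missing control (assumed pipeline, not the paper's code): build T
# 8x8 frames from one 32x32 grayscale image either with small sub-pixel drift shifts
# or as the identical static downsample repeated T times, so "motion vs. no motion"
# can be compared with the same recurrent front end.
import numpy as np
from scipy.ndimage import shift as subpixel_shift
from PIL import Image

def downsample(img32, size=8):
    """Bicubic downsample of a (32, 32) float32 image (single channel for simplicity)."""
    return np.asarray(Image.fromarray(img32).resize((size, size), Image.BICUBIC), dtype=np.float32)

def drift_sequence(img32, T=5, step_px=0.5, rng=None):
    """T low-resolution frames sampled along a small random-walk drift trajectory."""
    rng = np.random.default_rng() if rng is None else rng
    xy = np.cumsum(rng.normal(0.0, step_px, size=(T, 2)), axis=0)   # drift in 32x32-pixel units
    frames = [downsample(subpixel_shift(img32, s, order=3)) for s in xy]
    return np.stack(frames), xy

def static_sequence(img32, T=5):
    """Control condition: the same static downsample repeated T times (no motion)."""
    frame = downsample(img32)
    return np.stack([frame] * T), np.zeros((T, 2))
```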
This paper takes inspiration from the biological phenomenon of fixational drift: slow, low-amplitude movements during fixation that are believed to result in hyperresolution in human vision. They hypothesize that this phenomenon can be explained by a model that has a recurrent convolutional front end that integrates over fixational drift, feeding into a well-trained backend from a conventional model (ResNet-50). They demonstrate that this dynamical recurrent classifier (DRC) is capable of restoring performance on 8x8 images to nearly the performance on high-resolution 32x32 CIFAR images (actually, no one would call 32x32 high resolution). They analyze the representations learned by the model and show they have strong spatiotemporal features, with some learned features emphasizing spatial structure, some emphasizing temporal structure, but most combining the two. Finally, they show that using curved trajectories improves performance over more random walks, which can potentially explain recent results in humans. They suggest this model can be useful in AI applications involving limited resolution but with multiple samples over time.
This paper is well written, proposes a highly innovative model that is consistent with behavioral and neural data, and obtains excellent results. The paper addresses a long-neglected aspect of human vision, fixational drift, in neurocomputational models of human vision and shows that it has efficacy in challenging conditions. They don't stop at demonstrating that the model improves accuracy over a static model that uses multiple images: they develop a method for assessing the dynamical features of their model, because the usual activation-maximization technique doesn't work in this setting. Finally, they demonstrate that a recently discovered phenomenon, curved paths in the fixational drift, promotes higher classification accuracy.
The front end of the model is a two-layer recurrent convolutional network. It is trained by feature distillation from a layer of ResNet-50; the ResNet-50 is pretrained on ImageNet and then finetuned on either CIFAR-10 or CIFAR-100. The recurrent net is provided with 8x8 images and trained to match the features activated by the 32x32 versions in the ResNet. The inputs are shifted slightly based on dynamical difference equations with random perturbations that determine the x-y coordinates of the next input, simulating fixational drift. The network was trained to reproduce the activations of the teacher network after 5 or 10 inputs. Then the output of this front end was input to the remaining layers of the ResNet-50 network, which was then finetuned to improve performance.
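As a rough picture of the student-teacher setup just described, here is a minimal sketch of a convolutional-GRU front end distilled against a frozen teacher feature map; the channel sizes, the 1x1 readout, and the pooling of teacher features are assumptions, not the paper's configuration.

```python
# Minimal sketch (layer sizes and names are assumptions): a ConvGRU front end sees T
# low-resolution frames and is trained so its final state matches a frozen teacher
# feature map (e.g., from a ResNet-50 layer applied to the 32x32 image).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvGRUCell(nn.Module):
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.gates = nn.Conv2d(in_ch + hid_ch, 2 * hid_ch, k, padding=k // 2)
        self.cand = nn.Conv2d(in_ch + hid_ch, hid_ch, k, padding=k // 2)

    def forward(self, x, h):
        z, r = torch.sigmoid(self.gates(torch.cat([x, h], dim=1))).chunk(2, dim=1)
        h_tilde = torch.tanh(self.cand(torch.cat([x, r * h], dim=1)))
        return (1 - z) * h + z * h_tilde                      # GRU-style state update

class RecurrentFrontEnd(nn.Module):
    def __init__(self, hid_ch=64, out_ch=256):
        super().__init__()
        self.hid_ch = hid_ch
        self.cell = ConvGRUCell(3, hid_ch)
        self.readout = nn.Conv2d(hid_ch, out_ch, 1)

    def forward(self, frames):                                # frames: (B, T, 3, 8, 8)
        b, t, c, h, w = frames.shape
        state = frames.new_zeros(b, self.hid_ch, h, w)
        for i in range(t):
            state = self.cell(frames[:, i], state)
        return self.readout(state)

def distillation_loss(student, teacher_features, frames):
    """Match the student's output to frozen teacher features, resized to the student's
    spatial resolution; assumes the channel counts have been made to agree."""
    out = student(frames)
    target = F.adaptive_avg_pool2d(teacher_features, out.shape[-2:]).detach()
    return F.mse_loss(out, target)
```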
The baselines are quite reasonable: a network trained directly on the 8x8 images; the same network but using the average prediction over the 5 or 10 images; and a ResNet+RNN network trained on a sequence of 5 8x8 images, with or without positional information. They show that with an increasing number of inputs (5 or 10 images), performance of the DRC improves, while the static network flatlines at 5 images; positional information also improves performance. In the end, a 10-step DRC network with positional information achieves performance nearly as good as the original 32x32 ResNet-50, on both CIFAR-10 and CIFAR-100.
They then go on to analyze the features. They perform the usual gradient-ascent procedure to obtain maximally activating 32x32 inputs for the features of the ResNet-50 network used to train the features of the DRC network. They find that this same procedure doesn't converge for the DRC network, so they have to invent a novel technique for finding the optimal features: they use the idea of the generative network (Nguyen et al. 2016), modified for their setting. The generative network has to learn to generate a sequence of 8x8 images that maximally activate the DRC features. These resemble the corresponding ResNet features they were trained on, but obviously have dynamics. To evaluate the spatial and temporal aspects of these units, they also apply the same procedure but only allow the generative network to generate one image that is repeated (giving the best spatial activation of the feature), or they only allow the generative network to vary the images while every image has the same pixel value everywhere (giving the best temporal activation, but without form). These activations generally aren't as high as the unconstrained optimization. It took me a while to parse Figure 2b, but once I figured it out it was reasonably clear.
Finally, they set up the fixation location dynamics in such a way that they can control the curvature of the drift. They find that the more curvature in the drift dynamics, the better the accuracy; in fact, an enforced spiral dynamics gives the best results. It turns out the performance data is based on this model, which is about 4 better than the less constrained model. This is interesting because it accords with recent human data from Michele Rucci's lab that finds curved drifts are used by subjects when the recognition problem is challenging. I haven't read that paper, so I don't know how faithful they are to Rucci's data or whether this correctly describes his results.
Weaknesses, with concrete actionable feedback: the weaknesses are mainly in the exposition; I had several clarification questions.
- It is unclear what the representation of the positional information is.
- Is the RNN an LSTM network? In general, I'm confused about the role of SmallNet in this paper; please clarify.
- The procedure by which the generative network determines the optimal features is not clear; this could be described more clearly. The supplementary material is insufficient in this regard. You have an unused half page in the main text, so that should be enough room to elucidate how this is done.
Minor comments (wording etc.):
- Page 1, 3rd line from the bottom: dominate → have dominated.
- First sentence in Section 2.1.1: "we applied a feature distillation learning paradigm". Next paragraph: therefor → therefore (spell check). Also in this paragraph, I initially thought you were saying you applied feature distillation to an 8x8 layer using 56x56 features, which is not what you did; this is one of those places where the role of SmallNet is unclear.
- stackup → processing stack; cosyne → cosine.
- "our model was mostly implemented in the keras package with the convolutional gru".
- In the sentence beginning "the accuracy of the reference teacher", it isn't clear which entry in the table you are referring to here; I believe it is "naive training", so call it that.
- In this sentence (middle of page 6): saptiotemporal → spatiotemporal (spell check); property → properties.
- Various places: use two left apostrophes (the key below the tilde on the standard keyboard) instead of right quotes in LaTeX on the left side of a word, e.g., spirals, near the bottom of page 6.
- In Figure 2b you say you are showing predominantly temporal, predominantly spatial, and mixed examples. If that's the case, I would expect one callout to be from the far-left point in the upper-left-hand corner (predominantly spatial; your choice is reasonable here, but the point to the left of it would be even better), a point in the lower right-hand corner (predominantly temporal), and then the third one you show. The point you use from the lower-left-hand corner corresponds to 0 temporal and low spatial, so there isn't a predominantly temporal example here; can you pick one from the lower-right-hand corner instead?
- Third line of the Figure 2 caption: students → students.
- Third line from the bottom of page 7: reported at tables → reported in tables.
- Last sentence in the Figure 3 caption: wors → worse (spell check; don't annoy your reviewers).
- Wording suggestion for the discussion: "this setting is novel and has been hardly addressed in the ..." → "this setting is novel and has been mostly neglected in the ...".
- Middle of page 8: stackup architecture.
- The last word in the third paragraph from the bottom of page 8 is not the one you want: prepossessing → preprocessing step.
- furthermore → furthermore; to idealistic → to the idealistic.
- First line, last paragraph: it sets → our work sets.
- Last sentence, last paragraph: this is not really a sentence in English; rewrite as "this is enabled by a solution ...".
- Caption of supplementary Figure S4: the second sentence is garbled; it needs a "right" somewhere, or "compared to".
- I don't know what "same" means in the padding column of your supplementary tables; same as what?
This paper is well written, proposes a highly innovative model that is consistent with behavioral and neural data, and obtains excellent results. The paper addresses a long-neglected aspect of human vision, fixational drift, in neurocomputational models of human vision and shows that it has efficacy in challenging conditions. This result suggests that it can be used in engineering applications where the stimuli are low resolution. It has some confusing parts, but these can be fixed by the authors.

The paper's main claim is that recurrence helps enhance visual acuity in settings with limited resolution, such as the one imposed by limited photoreceptors in the retina. The authors therefore build a convolutional network with recurrent connectivity in its early layers, termed DRC, that receives a time series of low-resolution frames and learns representations for classification on CIFAR from a teacher network receiving full-resolution inputs. DRC outperforms a low-resolution baseline and approaches standard-resolution performance. Additionally, the paper visualizes the DRC's learned features.
Pros:
- As far as I can tell, this is a novel setting, and I have not seen much work investigating the impact of low retinal resolution on object recognition models.
- The results on CIFAR-10 and CIFAR-100 are clearly described and show that the recurrent DRC model, aided by a full-resolution teacher, can regain most of the performance of a standard-resolution model.
- The paper provides good background on the biological motivation for modeling low-resolution photoreceptors.
Cons:
- Lack of connection to biology: the proposed model is motivated by biological observations, but model predictions are never tested against any experimental results.
Are the model's resulting features any more brain-like? Does it exhibit the same hyperacuity as observed in biology?
- Requirement of a teacher: the DRC is only tested when learning representations from a teacher, which both has a non-obvious connection to biology and is an unfair comparison to the non-recurrent baselines, which do not use a teacher. Would any of the baselines perform better when trained with a full-resolution teacher in the same way as the DRC?
- Unclear benefits for computer vision: it is not obvious to me if or where processing sensory data with low resolution but many temporal samples will be helpful to the machine learning community. Some connection is made in the very last paragraph to always-on cameras, such as body-worn cameras, but it is not made clear if those are really in the regime of low resolution and high temporal sampling.
Minor:
- Some more discussion of related recurrent models (e.g., https://papers.nips.cc/paper/2019/hash/7813d1590d28a7dd372ad54b5d29d033-Abstract.html and https://www.pnas.org/content/116/43/21854.short) would be helpful to contextualize the work.
- The second-to-last paragraph on page 4 states that the ResNet+RNN achieved accuracy lower by 3.5 and 10, respectively, for CIFAR-10/100, but Table 2 has its accuracy as 83.94/59.61 compared to the standard-resolution 96.83/82.94; this seems to be inconsistent.
- It is not clear to me what to take from the visualization of features (Figs. 2 and 3).
- In general, I found the paper a bit hard to follow at times; the consistent story is not clear to me, e.g., how do the feature visualizations support the main claim?
- The Figure 3 caption has a typo: "the wors case".
The paper lacks a clear demonstration of usefulness: either an improved fit to biological data (since the motivation starts from limited sampling in the retina) or a clear use case in computer vision. Since neither is demonstrated, I find it really hard to contextualize the work and cannot tell if the proposed model makes any improvements over previous models (see the main review for detailed suggestions). The use of a full-resolution teacher network is also not well motivated, especially in connection to biology, and the second half of the paper is a bit hard to follow (i.e., what to take from the feature visualizations).
Rebuttal update: I have increased my score following the authors' attempts at connecting to biology more directly, but I still believe key comparisons are missing: either a stronger link to biology, concretely relating model predictions to experimental results, and/or explicit comparisons to alternative models on ML tasks.

Here the authors attempt to leverage spatiotemporal computations for object recognition on the standard CIFAR-10 and CIFAR-100 datasets. In short, they use a network with a front end of recurrent units (ConvGRU) to recognize objects given spatially jittered, downsampled images, effectively approximating an active sensor. The network is trained in a student-teacher configuration where weights in a temporal pooling layer after the recurrent layers are trained to match the weights of a feature layer in ResNet-50; next, the network is finetuned to increase classification accuracy. Altogether, the authors are asking whether spatiotemporal computations are enough to produce a feature layer similar to a larger network trained on full-resolution images, and in turn, whether this feature layer supports object recognition on par with the full-resolution performance of ResNet-50. The authors demonstrate that their network is almost as performant as ResNet-50 with 4x downsampled images, especially when the downsampled images are jittered in a spiral formation.
formation. they also present analysis demonstrating that the network is in fact performing spatiotemporal calculations.

strengths
1. the use of an active sensor, ie jittering the input image, is an interesting idea that capitalizes on recent developments in neuroscience and psychology.
2. on first blush the results are relatively strong, however with a caveat that more controls are needed to interpret them.

major issues
1. a central claim made by the authors is that spatiotemporal computations in the frontend of the network are important. the main evidence here is that the resnet-rnn network, ie putting the recurrent computations on the backend, does not work nearly as well; in fact resnet-rnn appears to do no better than simply averaging the prediction of resnet over 5 frames, which is surprising. this needs to be evaluated much more systematically, since it is not an apples-to-apples comparison: why is the rnn only used after the global average pooling layer? here the drc is using convgru, while the comparison is made with vanilla gru units in the resnet-rnn network, so they do not have access to spatial information. so really the comparison is spatiotemporal computation for the drc network versus temporal-only computation for resnet-rnn. it is also unclear how the resnet-rnn network incorporates spatial information, since i could not find the parameters related to input trajectory in the appendix, and it is also unclear how the resnet-rnn network was trained.
2. figure 2 demonstrates that the drc network uses a mixture of spatial and temporal computation, but the results are under-analyzed. does the network produce a similar distribution across different random initializations? does the performance covary with these distributions? what happens if units with specific criteria are ablated? much more analysis is needed to be able to interpret the importance of what's shown in figure 2.
3. another central claim is that the trajectory of images over time is important (fig 3). in general this result is under-analyzed and difficult to interpret as is. as k becomes more negative and the trajectories more curved, trajectories are likelier to remain closer to the center and have more overlap, yet i could not find any analysis of this. if indeed curvature matters, then the authors must show that curved trajectories are better-performing than other trajectories with less curvature but similar aggregate statistics, eg trajectories that are a similar distance from the center point and have similar degrees of overlap. a very simple control here is to shuffle the trajectories over time: in this case the statistics should be the same, but the degree of curvature from point to point will be destroyed. if the drc network performs just as well with the shuffle, then the overall statistics matter more than curvature. this is essential to understanding what allows the drc network to perform well.

minor issues
1. please check for typos.
2. figure 3: please provide a legend.
3. figure 3: i do not understand why the authors are plotting an average of 2 datapoints; many more points should be computed to estimate the distribution properly so we can visualize a reasonable confidence interval for each parameter setting.

in summary, i found the core idea of the authors' network interesting; however, the authors' claims are currently not justified by the results. many more controls are needed to ensure that the drc network performs better than alternatives. if the drc network does perform better than all other control networks, then additional analysis is required to understand how the network is able to
improve its performance.

### Summary:
this paper explores the idea that fixational drift of a sensor over an image, something that primate eyes do, could be used to achieve visual hyperacuity, ie image recognition with low-resolution images equivalent to what would be achieved with high-resolution images. the authors construct networks where the bottom of a deep convnet is replaced by recurrent networks, and the network is then trained on low-resolution versions of high-resolution images that are sampled with fixational drift across the image. the authors show that this approach allows their system (dynamical recurrent classifier, or drc) to get much better classification performance on cifar images than can be achieved without the early recurrence and drift. the authors also show that the most robust classification mandates drift trajectories with higher curvature, and they show that this matches some of the properties of visual drift trajectories in humans.

the reviews on this paper were highly divergent, ranging from 3 to 10. three of the reviewers felt this paper should be rejected, but one felt very strongly it should be accepted. the primary concerns from the negative reviewers were lack of appropriate controls, lack of insight into why the system works, lack of appropriate references to past work, and lack of connection to biology. the authors made a very concerted effort to attend to all of the reviewers' comments: they ran all of the requested control experiments, updated the text to better reflect past literature, and included some comparison to psychophysics data. in the end only one reviewer increased their score, though, leading to final scores of 3, 10, 5 and 3, and discussion did not lead to any more consensus. thus this paper was still very much in the borderline zone and required ac consideration.

after reading through the paper, reviews, and rebuttals, the ac felt that the authors really had addressed the primary concerns as best as could be hoped for in the timeframe for iclr, and that the paper was sufficiently interesting and informative for ml and neuroscience to be worthy of publication. some of the negative review points stand, eg there are still some mysteries as to why this works, and there is certainly a lot more that could be done to make this paper informative for neuroscience. nonetheless, in total the ac felt that this paper deserved to be accepted, given that the authors did most of what the reviewers requested of them.
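one of the reviews above proposes shuffling the drift trajectory over time as a control for the curvature claim. a minimal sketch of that manipulation, assuming a trajectory stored as a (T, 2) array of gaze offsets (names and sizes here are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# illustrative drift trajectory: T time steps of (x, y) offsets
T = 5
trajectory = np.cumsum(rng.normal(scale=0.5, size=(T, 2)), axis=0)

# control: permute the time steps. the set of visited positions (and hence
# aggregate statistics such as spread from the center and overlap) is
# unchanged, but the point-to-point curvature of the path is destroyed.
shuffled = trajectory[rng.permutation(T)]
```

if a model driven by `shuffled` classifies as well as one driven by `trajectory`, the aggregate statistics, rather than curvature per se, are the more likely explanation.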
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
this paper tackles the problem of sensor fusion, where multiple, possibly differing, sensor modalities are available and neural network architectures are used to combine information from them to perform prediction tasks. the paper proposes modifications to a gated fusion network, specifically 1) grouping sets of sensors and concatenating them before further processing, and 2) performing multilevel fusion where early sensor data representations are concatenated to produce weightings additional to those obtained from features concatenated at a later stage. experimental results show that these architectures achieve performance gains of 2-6, especially when sensors are noisy or missing.

strengths
- the architectures encourage fusion at multiple levels, especially the second one, which is a concept that has been successful across the deep learning literature
- the paper looks at an interesting topic, especially related to looking at the effects of noise and missing sensors on the gating mechanisms
- the results show some positive performance gains, although see caveats below

weaknesses
- the related work paragraph is extremely sparse. fusion is an enormous field (see the survey cited in this paper as well as [1]), and i find the small choice of fusion results with a youbot to be strange. a strong set of related work is necessary, focusing on those that are similar to the work; as an example, spatiotemporal fusion (slow fusion [2]) bears some resemblance to this work, but there are many others, eg [3, 4], as a few examples.
  [1] ramachandram, dhanesh, and graham w taylor. deep multimodal learning: a survey on recent advances and trends. ieee signal processing magazine 34(6), 2017, 96-108.
  [2] karpathy, andrej, et al. largescale video classification with convolutional neural networks. proceedings of the ieee conference on computer vision and pattern recognition, 2014.
  [3] mees, oier, andreas eitel, and wolfram burgard. choosing smartly: adaptive multimodal fusion for object detection in changing environments. intelligent robots and systems (iros), 2016 ieee/rsj international conference on, ieee, 2016.
  [4] kim j, koh j, kim y, choi j, hwang y, choi j w. 2018. robust deep multimodal learning based on gated information fusion network. arxiv preprint arxiv:1807.06233.
- the paper claims to provide a deep understanding of the relationships between sensory inputs, fusion weights, network architecture, and resulting performance. i dont think it really achieves this with the small examples of weights for some simple situations.
- it is very unclear whether the architectures have more or less parameters. at one point it is stated that the original architecture overfits and the new architecture has less parameters (sec 2.2 and 3), but then it is stated that for fairness the number of neurons is equalized (5.2), and later in that section that the new architectures have additional neurons. which of these is accurate?
- related to the previous point, and possibly the biggest weakness, the experimental methodology makes it hard to tell if performance is actually improved. for example, it is not clear to me that the performance gains are not just a result of less overfitting (for whatever reason) of baselines, and that the fixed number of epochs therefore results in stopping at a better performance. please show training and validation curves so that we can see whether the epochs chosen for the baselines are not just chosen after overfitting, in which case early stopping will improve the performance. as another example, there
are no variances shown in the bar graphs.
- the examples with noise and failures are limited. for example, it is also not clear why an increase of noise in the rpm feature (table 5) actually increases the weight of that group in the twostage architecture; what does that mean? in general there isnt any principled method proposed for analyzing these situations.

some minor comments/clarifications
- what is the difference between these gated networks and attentional mechanisms, eg alpha attention (see the "attention is all you need" paper)?
- what is a principled method to decide on the groupings?
- there are several typos throughout the paper: "in the presence of snesor" -> "in the presence of sensor" throughout the paper; "predication" -> "prediction"; "likelihood of stucking the training"
- tensorflow is not a simulation environment

overall the paper proposes architectural changes to an existing method for fusion, and while positive results are demonstrated, there are several issues in the experimental methodology that make it unclear where the benefits come from. further, the paper lacks novelty, as multilevel fusion has been explored significantly and the changes are rather minor. there is no principled method or concepts that drive the architectural changes, and while the authors claim a deeper investigation into the network's effectiveness under noise and failures, the actual analysis is too shallow.

docsep

overview and contributions: the authors improve upon several limitations of the baseline netgated architecture by proposing 1) a coarser-grained gated fusion architecture and 2) a twostage gated fusion architecture. the authors show improvements in driving mode prediction and human activity recognition in settings where all modalities are observed, as well as settings where there are noisy or missing modalities.

strengths
1. the model seems interesting and tackles the difficult problem of multisensor fusion under both normal and noisy settings.
2. good results obtained on standard benchmarks, with improvements in settings where all modalities are observed as well as settings where there are noisy or missing modalities.

weaknesses
1. i am worried about the novelty of the proposed approach. the main idea for the fusion-group gated fusion architecture is to perform additional early fusion of sensory inputs within each group, which reduces the number of group-level fusion weights and therefore the number of parameters to tune. the twostage gated fusion architecture simply combines the baseline model and the proposed fusion-group model. both these ideas seem relatively incremental.
2. doesnt the final twostage gated fusion architecture further increase the number of parameters as compared to the baseline model? i believe there are several additional fcnn blocks in figure 3 and more attention gating weights. i find this counterintuitive, since section 2.2 motivated potential overfitting as one drawback of the baseline netgated architecture. how does the increase in parameters for the final model affect the running time and convergence?

questions to authors
1. i dont understand tables 4/5/6: why are the results for group-level fusion weight in the middle of several columns? which features are being used in which groups? please make this clear using vertical separators.
2. for the proposed twostage gated fusion architecture, do the 2 branches learn different things, ie focus on different portions of the multimodal inputs? i would have liked to see more visualizations and analysis instead of just qualitative results.

presentation improvements, typos, edits, style, missing references
1. general:
poor presentation of experimental results: tables are not clear and bar graphs are not professionally drawn. the paper extends to 9 pages when a lot of space could be saved by making the presentation of experimental results more compact. i believe the guidelines mention that more pages can be used if there are extensive results, but i dont think the experimental results warrant the extra page.

docsep

this paper proposes two gated deep learning architectures for sensor fusion. they are all based on the previous work of naman patel et al, "modality fusion with cnns for ugv autonomous driving in indoor environments" (iros). by having the grouped features, the author demonstrated improved performance, especially in the presence of random sensor noise and failures.

organization/style: the paper is well written, organized, and clear on most points. a few minor points:
1. the total length of the paper exceeds 8 pages; some figures and tables should be adjusted to have it fit into 8 pages.
2. the literature review is limited.
3. there are clearly some misspellings, for example "netgated" is often written as "negated".

technical accuracy: the two architectures that the author proposes are all based on the grouped features, which from my point of view is a very important and necessary part of the new model. however, the author failed to rigorously prove or clearly demonstrate why this is effective for the new model. moreover, how to make groups, or how many groups are needed, are not clearly specified. the experiments used only two completely different datasets, none of them related to the previous sensor fusion method they are trying to compete with; im afraid this method cannot generalize to a common case. in addition, if we look at table 4 and table 5, we can find the first group-level fusion weight actually increases, which seems contradictory to the result shown in table 6.

adequacy of citations: poor coverage of literature in sensor fusion; there are less than 10 references related to sensor fusion.

overall it is not an iclr standard paper.

### Summary:
the paper builds on the gated fusion network architectures and adapts those approaches to reach improved results; in that, it is incrementally worthwhile. all the same, all reviewers agree that the work is not yet up to par. in particular, the paper is only incremental and its novelty is not clear, it does not relate well to existing work in this field, and the results are not rigorously evaluated, thus its merit is unclear experimentally.
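the grouped, gated weighting scheme that the reviews above discuss can be sketched minimally as follows. this is an illustrative sketch under assumed shapes and names (a list of sensor groups, one scalar gating weight per group), not the paper's actual implementation:

```python
import torch
import torch.nn as nn

class GroupGatedFusion(nn.Module):
    """toy sketch of group-level gated fusion: features from each sensor
    group are concatenated, a small gating network produces one weight per
    group, and the weighted group features are concatenated for the
    prediction head. all layer sizes are illustrative."""

    def __init__(self, group_dims, hidden=32):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(sum(group_dims), hidden),
            nn.ReLU(),
            nn.Linear(hidden, len(group_dims)),
        )

    def forward(self, groups):
        # groups: list of (batch, dim_i) tensors, one per sensor group
        concat = torch.cat(groups, dim=-1)
        weights = torch.softmax(self.gate(concat), dim=-1)  # (batch, n_groups)
        fused = torch.cat(
            [w.unsqueeze(-1) * g for w, g in zip(weights.unbind(-1), groups)],
            dim=-1,
        )
        return fused, weights
```

inspecting `weights` under injected sensor noise or dropped groups is one way to probe the behaviour the reviewers ask about, eg whether a noisier group really receives a larger or smaller weight.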
247, 1805, 3045, 4496, 921, 3733, 285, 12820, 9191, 594, 326, 359, 476, 923, 1880, 253, 44540, 6777, 323, 253, 1666, 25379, 403, 417, 816, 6777, 846, 689, 31893, 275, 534, 1083, 2393, 15910, 588, 3157, 253, 3045, 347, 1529, 1650, 627, 403, 642, 48894, 2011, 275, 253, 2534, 14580, 50273, 783, 6667, 342, 6046, 285, 20101, 403, 3710, 323, 1650, 352, 310, 671, 417, 2590, 2139, 271, 2572, 273, 6046, 275, 253, 33936, 4735, 2829, 608, 2686, 5459, 253, 2801, 273, 326, 1387, 275, 253, 2500, 493, 486, 10336, 752, 1057, 326, 1599, 275, 2087, 627, 310, 2649, 667, 3505, 74, 6216, 1332, 4081, 323, 18918, 841, 9534, 50275, 8826, 5884, 5701, 498, 274, 6787, 50274, 5371, 310, 253, 3064, 875, 841, 305, 456, 6928, 285, 4116, 267, 6297, 24088, 9765, 4116, 923, 4116, 310, 512, 368, 878, 2929, 50274, 5371, 310, 247, 3505, 74, 6216, 1332, 281, 7617, 327, 253, 1387, 723, 50274, 9088, 403, 2067, 963, 993, 4768, 253, 2929, 50272, 249, 253, 3361, 273, 3802, 265, 263, 50276, 249, 253, 3361, 273, 8468, 50272, 10489, 483, 253, 2929, 2063, 8518, 50276, 12787, 2474, 50272, 7513, 10202, 273, 10960, 272, 253, 3733, 50274, 26109, 5449, 310, 417, 247, 9864, 3126, 50276, 1189, 455, 253, 2929, 29328, 27934, 2544, 281, 271, 5368, 1332, 323, 11781, 285, 1223, 2762, 1543, 403, 5183, 627, 403, 2067, 3374, 275, 253, 5661, 16182, 326, 1056, 352, 12744, 835, 253, 5373, 1705, 432, 2007, 253, 2929, 19756, 38135, 347, 1554, 48268, 11781, 556, 644, 14859, 3012, 285, 253, 2544, 403, 2581, 5884, 627, 310, 642, 3505, 74, 6216, 1332, 390, 12342, 326, 4446, 253, 27934, 2544, 285, 1223, 253, 4477, 1750, 247, 12861, 5839, 715, 253, 6928, 12510, 762, 6046, 285, 20101, 253, 4588, 1783, 310, 1512, 20126, 5474, 33032, 39930, 285, 9021, 253, 4477, 3157, 2220, 2067, 7364, 273, 253, 8245, 2297, 456, 10336, 407, 36636, 337, 247, 820, 1032, 1326, 11273, 305, 456, 11781, 10336, 285, 374, 247, 2500, 493, 486, 305, 456, 11781, 10336, 253, 4477, 921, 11701, 275, 6276, 4438, 10554, 285, 1966, 2425, 8981, 275, 7533, 835, 512, 33433, 403, 2540, 347, 973, 347, 7533, 835, 627, 403, 27620, 390, 5816, 33433, 50276, 296, 3755, 20556, 337, 253, 1566, 3133, 4722, 285, 39223, 253, 2834, 1895, 273, 1554, 261, 11313, 11781, 762, 1097, 2622, 285, 27620, 7533, 374, 1175, 1543, 2797, 327, 2629, 49602, 342, 11701, 275, 7533, 835, 512, 33433, 403, 2540, 347, 973, 347, 7533, 835, 627, 403, 27620, 390, 5816, 33433, 50276, 20881, 1255, 265, 337, 891, 717, 11926, 670, 253, 38135, 273, 253, 4081, 2746, 253, 2022, 2934, 323, 253, 11781, 4399, 305, 456, 11781, 10336, 310, 281, 1347, 3081, 2393, 11781, 273, 17872, 14800, 1561, 1016, 1387, 534, 11355, 253, 1180, 273, 650, 276, 713, 652, 11781, 13461, 285, 3103, 253, 1180, 273, 3602, 281, 19928, 253, 2500, 493, 486, 305, 456, 11781, 10336, 3365, 24772, 253, 8245, 1566, 285, 253, 4081, 11781, 4399, 1566, 1097, 841, 5697, 1646, 4942, 32809, 374, 36908, 253, 2457, 2500, 493, 486, 305, 456, 11781, 10336, 2007, 2572, 253, 1180, 273, 3602, 347, 2429, 281, 253, 8245, 1566, 891, 2868, 627, 403, 2067, 3081, 269, 68, 9866, 8336, 275, 4677, 495, 285, 625, 4116, 305, 839, 13461, 891, 1089, 436, 4828, 565, 48714, 1580, 2593, 3307, 17194, 2442, 689, 31893, 347, 581, 32489, 273, 253, 8245, 2036, 27189, 10336, 849, 1057, 253, 2572, 275, 3602, 323, 253, 2457, 1566, 2818, 253, 3515, 673, 285, 14940, 50275, 34974, 281, 4477, 337, 891, 13414, 2096, 7180, 38094, 2139, 403, 253, 1543, 323, 650, 276, 713, 652, 11781, 2801, 275, 253, 4766, 273, 2067, 9930, 534, 3386, 403, 1146, 908, 275, 534, 2390, 4496, 1056, 436, 2590, 970, 9118, 2533, 2392, 374, 323, 253, 
4081, 2500, 493, 486, 305, 456, 11781, 10336, 513, 253, 374, 12998, 3037, 1027, 1841, 26332, 2770, 327, 1027, 11821, 273, 253, 23390, 26306, 14800, 891, 651, 452, 10490, 281, 923, 625, 5304, 5904, 285, 1783, 3185, 273, 816, 18276, 1543, 50276, 49836, 11701, 963, 993, 1407, 953, 3740, 5816, 10414, 337, 2087, 4105, 9759, 273, 5661, 1543, 7180, 403, 417, 2590, 285, 2534, 14580, 403, 417, 39470, 8392, 253, 2929, 8725, 281, 898, 7223, 672, 247, 2257, 273, 2317, 812, 320, 9809, 407, 2403, 253, 9759, 273, 5661, 1543, 625, 8566, 891, 2868, 253, 9600, 3748, 326, 625, 7223, 476, 320, 908, 604, 627, 403, 9470, 1543, 533, 891, 13414, 1158, 253, 5661, 1543, 7501, 253, 4465, 268, 2961, 406, 33032, 2520, 2929, 29328, 767, 305, 456, 3676, 4715, 35615, 323, 8468, 11781, 597, 403, 512, 1754, 327, 253, 2045, 789, 50276, 6292, 266, 869, 293, 1162, 14350, 36453, 11781, 342, 260, 79, 2224, 323, 15649, 87, 26279, 6276, 275, 24340, 12620, 891, 2921, 407, 1907, 253, 24104, 3386, 253, 2488, 5183, 5520, 3045, 3340, 275, 253, 3361, 273, 3632, 8468, 6046, 285, 20101, 50276, 25590, 4826, 253, 2929, 310, 973, 3542, 10932, 285, 2590, 327, 954, 2792, 247, 1643, 5884, 2792, 337, 253, 2264, 2978, 273, 253, 2929, 23141, 854, 7223, 690, 8442, 285, 7180, 943, 320, 10904, 281, 452, 352, 4944, 715, 854, 7223, 374, 253, 6239, 2278, 310, 3710, 495, 627, 403, 4518, 690, 2985, 81, 437, 723, 323, 1650, 253, 2036, 27189, 310, 2223, 3542, 347, 2297, 456, 50276, 48746, 7200, 253, 767, 10336, 326, 253, 2488, 29328, 512, 1754, 327, 253, 24104, 3386, 534, 281, 619, 1127, 273, 1859, 310, 247, 1077, 1774, 285, 3309, 629, 273, 253, 747, 1566, 2299, 253, 2488, 4242, 281, 8132, 29689, 5276, 390, 4518, 5183, 326, 2139, 436, 310, 3576, 281, 776, 747, 1566, 50276, 3062, 1189, 849, 281, 1056, 2390, 390, 849, 1142, 2390, 403, 3058, 403, 417, 4518, 7616, 253, 4679, 908, 760, 767, 4336, 1027, 15302, 5293, 273, 731, 403, 2905, 281, 253, 2045, 8468, 11781, 1332, 597, 403, 2820, 281, 15639, 516, 9202, 436, 1332, 2550, 39970, 281, 247, 1846, 1083, 50276, 249, 1635, 604, 359, 1007, 387, 2829, 577, 285, 2829, 608, 359, 476, 1089, 253, 806, 650, 276, 713, 652, 11781, 2801, 2686, 5459, 534, 3133, 34126, 281, 253, 906, 2011, 275, 2829, 721, 50276, 14629, 1974, 273, 30404, 50276, 31943, 7031, 273, 6239, 275, 8468, 11781, 627, 403, 1679, 685, 884, 10414, 403, 2905, 281, 8468, 11781, 50276, 1189, 455, 352, 310, 417, 271, 17857, 32888, 2629, 2929, 187, 187, 4118, 18435, 27, 783, 2929, 21168, 598, 327, 253, 305, 456, 11781, 2990, 35615, 285, 5223, 1110, 7274, 281, 3986, 5520, 1543, 50276, 249, 326, 352, 310, 17627, 595, 32811, 50276, 455, 253, 1072, 512, 30628, 5194, 326, 253, 789, 310, 417, 2568, 598, 281, 1061, 50276, 249, 1798, 253, 2929, 310, 760, 32809, 285, 253, 38135, 273, 352, 310, 417, 2590, 50276, 262, 1057, 417, 14588, 973, 281, 5368, 789, 275, 436, 1673, 285, 253, 1543, 403, 417, 8132, 29689, 6760, 3021, 697, 15785, 310, 12744, 21657, 50276 ]
Below is a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: The main result of the paper is an upper bound on the Rademacher complexity of neural networks in the case of adversarial examples, i.e., an analysis with respect to the robust loss class. Previous bounds for linear classifiers and shallow neural networks were investigated by Yin et al. (2019) and Awasthi et al. (2020). The known methods for upper-bounding the Rademacher complexity of deep neural networks in the standard case do not apply here; the main idea in this paper is to analyze the covering numbers directly, as opposed to computing them by induction over the layers. The question investigated in the paper is natural and was posed as an open question by previous papers, and the main technique seems reasonable. The experiments suggest that the change in the margin and in the product of the weight norms may explain the larger generalization gap compared to standard generalization.

Weakness: showing a depth/width-dependent lower bound could have been nice. Citing the paper https://arxiv.org/pdf/1810.02180.pdf (ALT 2019 and JMLR) is very relevant: in its section on adversarial generalization it provides uniform convergence results in the case of a finite number of perturbations for a mixture of classifiers, and the analysis also goes through the Rademacher complexity. In Section 3, d2 should be 2d.

I recommend accepting the paper. I read the overview of the main proof but not the fully detailed proof.

docsep In this paper the authors develop an upper bound on the adversarial Rademacher complexity which includes the product of weight norms. This implies that a large weight norm hinders good generalization performance in the adversarial setting. The authors also empirically show that the product of weight norms in adversarial training is indeed much larger than in standard training.

Strengths: 1. This paper provides new bounds for the adversarial Rademacher complexity. 2. The paper is well written and easy to read.

Weaknesses: the empirical validation in this paper seems insufficient. 1. Although the authors verified that the product of weight norms in adversarial training is indeed much larger than in standard training, I think this is not a strong empirical verification of the proposed bound. The reason is that there are constants in the bounds for adversarial training and for standard training which may be different; therefore, simply comparing the products of the weight norms is not rigorous. I would like to see more discussion on this. We know that some of the existing deep learning theory may be only mathematically correct, so providing sufficient empirical evidence of the consistency between theory and practice is important. 2. There is a product of weight norms in the proposed bound, which implies that the trained model can generalize better if this product is smaller. Can we therefore use some technique to regularize this product during training and improve the generalization ability? The authors are recommended to provide such experiments to support their theoretical results.

After rebuttal: after reading the response from the authors, I raised my rating. 1. This paper provides a new complexity bound for adversarial training. 2. The empirical validation of this paper is insufficient.

docsep This paper proposes new generalization bounds for adversarial training in neural networks based on the Rademacher complexity. These are more general than previous results in that they apply to neural networks of any depth. Experiments are performed on CIFAR-10 and CIFAR-100 with multiple VGG architectures; by relating these to the main quantities appearing in the proposed bounds, they provide an explanation for the limited generalization capacity of adversarial training.

Strengths: relevant topic, as the theoretical investigation of adversarial examples and adversarial training has gained traction in recent years; improved theoretical result over prior art; meaningful experiments based on the theoretical results that suggest some reasons why adversarial training does not generalize well; clear and well-written paper (a few typos remain).

Weaknesses: it would be great if the paper included a comparison to existing adversarial bounds based on the Rademacher complexity or on other frameworks, e.g., plotted for a toy example or a small network. Lacking this, it is harder to judge the improvement the current paper makes over prior results in terms of the tightness of the bound. Moreover, it might be worth comparing the contribution to other types of theoretical approaches in the field, e.g., the provable methods that the paper cites.

Questions and other comments: it would be good to underline earlier than the experiments section that the bounds provided also hold for convolutional neural networks, or, more generally, to state which layers are covered. What is the impact of using PGD adversarial examples in practice instead of the optimal perturbation? Is it reasonable to consider a loss in the range [0, 1] for neural networks? Are Rademacher complexity bounds tight enough for neural networks to be informative or applicable in practice?

Update post discussion: I am raising my rating by one point following the exchanges below. Good theoretical result supported by experimental evaluation on a relevant topic.

docsep This paper overcomes some technical difficulties to provide the adversarial Rademacher complexity of deep neural networks. Compared with the existing literature, which tries to bound other variants of the adversarial Rademacher complexity, this paper works directly on the adversarial Rademacher complexity itself, and it provides both a lower bound and an upper bound. Besides, the paper conducts numerical experiments which, combined with the theoretical bound, justify why adversarial training generalizes worse than standard training. I think the main contribution of this paper is interesting: it directly overcomes the difficulties in deriving the adversarial Rademacher complexity rather than using other variants, and it provides both the upper and the lower bound, both of which are important. However, there are still a lot of things that can be improved in this paper. Below are my major concerns, ordered by importance. My rating is currently weak accept given the importance of the upper and lower bounds, but it could be adjusted based on the author response to my concerns.

1. My understanding of adversarial training and the adversarial Rademacher complexity is that both adversarially trained and standard-trained neural networks have their own standard Rademacher complexity and adversarial Rademacher complexity, so the generalization gap between adversarial training loss and adversarial testing loss is larger than the generalization gap between standard training loss and standard testing loss, for both adversarially trained networks (denote these gaps a and b) and standard-trained networks (denote them c and d), i.e., a > b and c > d. On the other hand, through experiments it is observed that adversarially trained neural networks obtain larger norms, so their generalization is worse, i.e., a > d. Is my understanding correct? Is it essential to provide evidence for a > c or b > d in order to conclude a > d? In Figure 1, what does the generalization gap refer to? In addition, could you provide some insight on why the norm of the adversarially trained model is larger, and why increasing the sample size leads to a larger norm/margin? Also, given these observations, is there any way to improve the generalization performance of adversarial training, and are there any implications for the loss landscape of adversarially robust neural networks? The current Section 6 displays some observations but no detailed insights.

2. As important as the above, the current proof of Theorem 3 only says "by the results of the lower bounds of ..., we obtain that ...". Please provide more details on the existing results and on how the final result is obtained, and please provide some concrete illustration of this, either in the pdf or in the discussion.

3. The literature review on the generalization of adversarial training in this paper only considers works about the Rademacher complexity. There are many other studies of this generalization from other theoretical angles; below are some important articles. Please do some literature research in this general area of theoretical study and include it in the paper.

Papers which provide both upper and lower bounds:
- Dan, Wei, and Ravikumar. Sharp statistical guarantees for adversarially robust Gaussian classification. International Conference on Machine Learning, PMLR, 2020.
- Xing, Zhang, and Cheng. Adversarially robust estimate and risk analysis in linear regression. International Conference on Artificial Intelligence and Statistics, PMLR, 2021.

Papers about generalization properties:
- Allen-Zhu and Li. Feature purification: how adversarial training performs robust deep learning. arXiv:2005.10190, 2020.
- Javanmard, Soltanolkotabi, and Hassani. Precise tradeoffs in adversarial training for linear regression. Conference on Learning Theory, PMLR, 2020.
- Javanmard and Soltanolkotabi. Precise statistical analysis of classification accuracies for adversarial training. arXiv:2010.11213, 2020.
- Taheri, Pedarsani, and Thrampoulidis. Asymptotic behavior of adversarial training in binary classification. arXiv:2010.13275, 2020.
- Wu, Xia, and Wang. Adversarial weight perturbation helps robust generalization. Advances in Neural Information Processing Systems 33, 2020.
- Xing, Song, and Cheng. On the generalization properties of adversarial training. International Conference on Artificial Intelligence and Statistics, PMLR, 2021.
- Zhai et al. Adversarially robust generalization just requires more unlabeled data. arXiv:1906.00555, 2019.

4. The main idea of this paper is not hard to follow, and the authors make a lot of comparisons to the existing literature. It would be great if the authors could make the following questions clear when describing the proof steps of Theorem 1: (1) which steps are different from the derivation of the standard Rademacher complexity, and which steps are not essential — if we use Theorem 1 and take epsilon = 0, which steps should be modified to obtain the standard Rademacher complexity mentioned in Section 5.3; (2) which steps are different from the literature on adversarial Rademacher complexity, and which steps do they skip; (3) could you explain the remark just before Theorem 2 in detail?

Minor issues (not ordered by importance):

5. When mentioning your contributions in the last paragraph of Section 1, could you write some descriptions: (1) for the first contribution, are there any interesting findings in the bound; (2) for the second contribution, could you answer your "why" question?

6. Please consider removing Propositions 1 and 2 and moving Proposition 3 to the appendix; Propositions 1 and 2 do not help deepen the understanding of the main goal of this paper. Similarly, please shorten Section 3 for inequalities which are unrelated to the main goal.

This paper provides some important results about the adversarial Rademacher complexity, so I vote for weak acceptance, but there are many issues regarding the experiments, the proofs, and the writing of this paper.

### Summary: The paper made a solid theoretical contribution on the adversarial generalization bounds of multilayer neural networks. However, in its current form the paper has many issues in the claim that the product of the norms can explain the generalization gap.

1. Weight decay. The authors use the weight norm as the proxy for the generalization gap; however, it is unclear to me that the larger generalization gap of adversarially trained networks can be explained by the product of weight norms. To verify this carefully, the authors have to at least tune the weight decay to the largest possible extent that does not hurt the generalization error, and compare the products of the weight norms in that scenario. Without weight decay, the neural networks might learn a lot of redundancies in the weights, especially with adversarial training, which makes the product of the norms too large. The authors do perform experiments showing that with weight decay the generalization gap becomes smaller and the norms become smaller; however, it is totally unclear to me that the weight decay considered in the experiments is actually optimal. It could still be the case that with proper weight decay the product of the norms in adversarial training is actually smaller than that of clean training. Moreover, the authors should also clarify that the products of the norms observed in the experiments are simply too large: they cannot be plugged into the theoretical result to obtain any meaningful generalization bound.

2. The claim that the product of the norms in the Rademacher complexity is tight: this only holds for neural networks with one neuron per layer. Once there is more than one neuron, there can simply be one neuron that learns f(x) and another that learns -f(x), and they completely cancel each other, so the product of the norms is obviously not tight for any neural network with more than one neuron per layer; in fact, the gap can be infinitely large.

Unfortunately — I like the paper very much and I hope it can be published — the claim that the product of the norms can explain the generalization gap is simply too misleading and ill-supported. I encourage the authors to completely remove this claim and submit the paper to COLT.
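As a small worked illustration of the cancellation point in the summary above (an editorial sketch with notation chosen here for concreteness, not taken from the reviewed paper): consider a one-hidden-layer network of width two whose neurons share the same hidden weight vector w,

$$ g(x) \;=\; v_1\,\sigma(w^{\top}x) + v_2\,\sigma(w^{\top}x), \qquad v_1 = c,\; v_2 = -c,\; c > 0 . $$

For every c the network computes g(x) \equiv 0, so enlarging c does not enlarge the set of functions the network actually represents, while the norm product \|v\|_2 \cdot \|w\|_2 grows without bound. Any complexity bound proportional to the product of layer norms therefore cannot be tight once a layer has more than one neuron, which is exactly the gap the summary points out.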
[ 253, 4477, 16058, 326, 253, 1885, 273, 2801, 5222, 275, 48960, 3733, 310, 6296, 1199, 4067, 685, 326, 275, 2629, 3733, 891, 1158, 436, 310, 417, 247, 2266, 16774, 21999, 273, 253, 4081, 3033, 253, 1921, 310, 627, 5368, 690, 14637, 275, 253, 14493, 275, 48960, 3733, 285, 2629, 3733, 534, 778, 320, 1027, 3103, 3365, 10941, 253, 1885, 273, 253, 2801, 5222, 310, 417, 26565, 891, 651, 751, 281, 923, 625, 5955, 327, 436, 359, 871, 326, 690, 273, 253, 5368, 3676, 4715, 3762, 778, 320, 760, 11076, 1037, 3451, 3021, 5277, 4209, 16774, 1941, 281, 921, 253, 2882, 566, 875, 253, 3762, 285, 3946, 310, 1774, 50275, 19, 627, 4961, 247, 1885, 273, 253, 2801, 5222, 275, 253, 4081, 3033, 534, 8018, 326, 253, 10166, 1566, 476, 39970, 1805, 604, 436, 1885, 310, 4577, 3103, 476, 359, 897, 690, 5609, 281, 3963, 907, 436, 1885, 1309, 3733, 281, 3157, 253, 26647, 3745, 253, 4477, 403, 8521, 281, 1918, 824, 2238, 273, 4679, 281, 1329, 616, 10527, 1543, 50275, 6438, 30080, 22559, 50276, 6438, 4361, 253, 2380, 432, 253, 4477, 891, 5439, 619, 13716, 50276, 18, 436, 2929, 3400, 247, 747, 10454, 3033, 323, 48960, 3733, 50276, 19, 253, 16774, 12820, 273, 436, 2929, 310, 12497, 50276, 7152, 33032, 2520, 2929, 29328, 747, 26647, 14493, 323, 48960, 3733, 275, 11454, 6928, 1754, 327, 253, 1985, 358, 12844, 10454, 841, 403, 625, 2087, 685, 2045, 1543, 275, 326, 597, 4647, 281, 11454, 6928, 273, 667, 6864, 4679, 403, 2684, 327, 260, 338, 274, 740, 285, 260, 338, 274, 2313, 342, 2709, 362, 1266, 35615, 841, 2905, 342, 253, 2022, 13483, 15602, 275, 253, 4081, 14493, 2085, 271, 8813, 323, 253, 3710, 26647, 5350, 273, 48960, 3733, 20544, 50276, 15477, 9400, 273, 10527, 5839, 273, 48960, 6667, 285, 48960, 3733, 326, 556, 12103, 32535, 275, 3332, 1107, 50276, 303, 27369, 10527, 906, 689, 2720, 1445, 50276, 30407, 1020, 4679, 1754, 327, 253, 10527, 1543, 326, 1804, 690, 4606, 2139, 48960, 3733, 1057, 417, 39970, 973, 50276, 8250, 285, 973, 15720, 2929, 247, 1643, 963, 993, 3464, 50276, 20881, 1255, 265, 50276, 262, 651, 320, 1270, 604, 253, 2929, 2908, 247, 5301, 281, 5368, 48960, 14493, 1754, 327, 1985, 358, 12844, 10454, 390, 643, 31225, 24088, 17944, 323, 247, 20953, 1650, 390, 1355, 2990, 14999, 436, 352, 310, 12150, 281, 5963, 253, 7756, 253, 1655, 2929, 2789, 689, 2720, 1543, 275, 2426, 273, 6863, 1255, 273, 253, 3033, 25761, 352, 1537, 320, 4409, 10941, 253, 7680, 281, 643, 3510, 273, 10527, 7274, 275, 253, 1673, 24088, 253, 872, 494, 3082, 326, 253, 2929, 28070, 50276, 34974, 285, 643, 5701, 50276, 262, 651, 320, 1175, 281, 762, 1282, 326, 253, 14493, 2530, 671, 2186, 323, 27311, 267, 11454, 6928, 4321, 685, 253, 4679, 2593, 390, 625, 3839, 752, 8090, 403, 6107, 50276, 5371, 310, 253, 3486, 273, 970, 23256, 69, 48960, 6667, 275, 3946, 3185, 273, 253, 8654, 20452, 50276, 261, 352, 5272, 281, 1908, 247, 2957, 275, 253, 2491, 14805, 323, 11454, 6928, 50276, 609, 1985, 358, 12844, 10454, 14493, 6863, 2217, 323, 11454, 6928, 281, 320, 27096, 390, 7763, 275, 3946, 50276, 11183, 1501, 5955, 891, 717, 12976, 619, 13716, 407, 581, 1127, 1563, 253, 23261, 2708, 1175, 10527, 906, 4516, 407, 5661, 7103, 327, 4623, 9400, 5474, 33032, 2520, 2929, 689, 3217, 690, 7681, 12748, 281, 2085, 48960, 1985, 358, 12844, 10454, 273, 3676, 11454, 6928, 2429, 342, 5368, 6239, 534, 1611, 281, 921, 643, 11640, 273, 48960, 1985, 358, 12844, 10454, 436, 2929, 3587, 2987, 327, 48960, 1985, 358, 12844, 10454, 3139, 436, 2929, 3400, 1097, 253, 2406, 3033, 285, 253, 5170, 3033, 16280, 436, 2929, 2589, 10704, 4679, 281, 13398, 342, 253, 10527, 3033, 281, 
15249, 2139, 48960, 3733, 556, 247, 7197, 26647, 685, 2629, 3733, 891, 1158, 253, 2022, 7680, 273, 436, 2929, 310, 4722, 352, 3587, 689, 3217, 253, 12748, 275, 44190, 48960, 1985, 358, 12844, 10454, 2581, 685, 970, 643, 11640, 352, 3400, 1097, 253, 5170, 3033, 285, 2406, 3033, 1097, 273, 534, 403, 1774, 50275, 35529, 627, 403, 1335, 247, 2257, 273, 1841, 326, 476, 320, 5520, 275, 436, 2929, 2708, 403, 619, 2201, 7350, 4404, 436, 2929, 6960, 275, 6349, 619, 13716, 310, 4390, 5075, 2997, 1677, 253, 6349, 273, 253, 5170, 285, 2406, 14493, 533, 352, 812, 320, 10904, 1754, 327, 253, 2488, 2380, 4404, 619, 7350, 50275, 18, 619, 4685, 4404, 253, 48960, 3733, 285, 253, 48960, 1985, 358, 12844, 10454, 310, 326, 1097, 48960, 32927, 285, 2629, 32927, 11454, 6928, 452, 616, 1211, 2629, 1985, 358, 12844, 10454, 285, 48960, 1985, 358, 12844, 10454, 594, 253, 26647, 8037, 273, 48960, 3733, 2957, 285, 48960, 5175, 2957, 310, 4067, 685, 253, 26647, 8037, 273, 2629, 3733, 2957, 285, 2629, 5175, 2957, 323, 1097, 48960, 32927, 9173, 347, 247, 285, 270, 285, 2629, 32927, 11454, 6928, 9173, 347, 260, 285, 277, 26332, 490, 285, 22942, 327, 253, 643, 1133, 949, 4679, 352, 310, 2540, 326, 253, 18539, 274, 1365, 10166, 11454, 6928, 31326, 4067, 22429, 594, 697, 26647, 310, 7197, 26332, 519, 310, 619, 4685, 3451, 310, 352, 5667, 281, 2085, 1941, 323, 913, 390, 270, 69, 275, 643, 281, 7525, 519, 275, 4677, 337, 752, 1057, 253, 26647, 8037, 3730, 281, 50276, 249, 1635, 812, 368, 2085, 690, 16039, 327, 2139, 253, 5222, 273, 18539, 274, 1365, 10166, 1566, 310, 4067, 2139, 1057, 3629, 253, 3530, 1979, 1421, 281, 247, 4067, 5222, 15456, 671, 1677, 841, 7313, 310, 627, 667, 1039, 281, 3157, 253, 26647, 3045, 273, 48960, 3733, 310, 627, 667, 12739, 327, 253, 2957, 13016, 273, 48960, 10237, 11454, 6928, 253, 1655, 2593, 721, 12646, 690, 7313, 533, 642, 7000, 16039, 50276, 18, 284, 1774, 347, 253, 1840, 253, 1655, 4737, 273, 10012, 495, 760, 2296, 407, 253, 1543, 273, 253, 2406, 14493, 273, 50276, 664, 4044, 326, 50276, 32897, 2085, 625, 4278, 327, 253, 5368, 1543, 285, 849, 281, 4044, 253, 2457, 906, 4496, 2085, 690, 11859, 33954, 281, 436, 2057, 275, 31697, 390, 275, 253, 5955, 50275, 20, 253, 6239, 2278, 273, 26647, 273, 48960, 3733, 275, 436, 2929, 760, 19401, 1110, 670, 1985, 358, 12844, 10454, 627, 403, 1142, 643, 2175, 2444, 327, 253, 26647, 432, 643, 10527, 7794, 2708, 403, 690, 1774, 7774, 4496, 513, 690, 6239, 2561, 275, 436, 2087, 2170, 273, 10527, 1263, 285, 2486, 275, 436, 2929, 50276, 50004, 534, 2085, 1097, 5170, 3033, 285, 2406, 3033, 50276, 21329, 260, 864, 340, 9634, 359, 74, 285, 819, 796, 554, 1218, 30127, 22711, 9479, 7605, 1149, 33573, 265, 323, 18539, 274, 1365, 10237, 305, 12064, 9162, 5213, 8059, 327, 5145, 4715, 268, 1686, 83, 9169, 50276, 89, 272, 340, 489, 8864, 478, 5801, 1182, 12109, 285, 1149, 606, 260, 24176, 18539, 274, 1365, 10237, 6642, 285, 2495, 1783, 275, 4872, 9077, 5213, 8059, 327, 13345, 9260, 285, 9990, 268, 1686, 83, 43425, 50276, 50004, 670, 26647, 3607, 50276, 455, 12586, 11917, 1182, 2653, 9041, 285, 340, 9041, 91, 5801, 632, 4735, 23609, 849, 48960, 3733, 17923, 10237, 3676, 4715, 549, 32693, 638, 3845, 549, 32693, 1518, 3712, 520, 2270, 9169, 50276, 75, 18444, 78, 472, 519, 293, 35926, 5168, 1220, 85, 9201, 46841, 18754, 285, 288, 3163, 38193, 6451, 10799, 5454, 14273, 275, 48960, 3733, 323, 4872, 9077, 8059, 327, 4715, 3762, 268, 1686, 83, 9169, 50276, 75, 18444, 78, 472, 519, 293, 285, 35926, 5168, 1220, 85, 9201, 46841, 18754, 10799, 7605, 1783, 273, 9162, 3933, 19103, 323, 
48960, 3733, 549, 32693, 638, 3845, 549, 32693, 1252, 520, 805, 1012, 9169, 50276, 893, 248, 363, 288, 375, 34103, 17653, 31987, 7690, 1032, 6451, 285, 37622, 375, 7635, 1301, 3941, 30861, 20185, 3879, 273, 48960, 3733, 275, 8985, 9162, 549, 32693, 638, 3845, 549, 32693, 1252, 520, 1237, 1976, 9169, 50276, 44217, 277, 543, 89, 757, 6294, 8500, 1269, 571, 285, 340, 11889, 259, 606, 48960, 2801, 20452, 7729, 10237, 26647, 16424, 275, 11454, 1491, 5162, 2718, 5922, 9169, 50276, 89, 272, 340, 489, 2805, 338, 266, 4498, 285, 1149, 606, 260, 24176, 327, 253, 26647, 3607, 273, 48960, 3733, 5213, 8059, 327, 13345, 9260, 285, 9990, 268, 1686, 83, 43425, 50276, 91, 16926, 1408, 85, 757, 1162, 355, 18539, 274, 1365, 10237, 26647, 816, 4419, 625, 440, 22027, 941, 549, 32693, 638, 3845, 549, 32693, 746, 3071, 361, 27865, 6247, 50275, 21, 253, 2022, 2934, 273, 436, 2929, 310, 417, 1892, 281, 956, 285, 253, 4477, 1056, 247, 2257, 273, 5301, 281, 5368, 6239, 352, 651, 320, 1270, 604, 253, 4477, 812, 1056, 352, 2590, 670, 253, 1563, 3533, 672, 12930, 253, 4737, 5018, 275, 10012, 337, 50273, 18, 534, 5018, 403, 1027, 432, 253, 28529, 273, 2629, 1985, 358, 12844, 10454, 534, 5018, 403, 417, 5667, 604, 359, 897, 10012, 337, 285, 1379, 299, 4277, 17, 752, 5018, 943, 359, 10007, 281, 4044, 253, 2629, 1985, 358, 12844, 10454, 5393, 275, 2593, 8676, 50273, 19, 534, 5018, 403, 1027, 432, 253, 6239, 670, 48960, 1985, 358, 12844, 10454, 534, 5018, 513, 597, 17049, 50273, 20, 812, 368, 5513, 253, 7579, 846, 1078, 10012, 374, 275, 2508, 50274, 37585, 2523, 417, 6960, 275, 6349, 50276, 22, 672, 29570, 634, 9021, 275, 253, 1390, 12494, 273, 2593, 337, 812, 368, 3630, 690, 20121, 50272, 18, 323, 253, 806, 7680, 310, 627, 667, 4722, 4342, 275, 253, 3033, 50273, 19, 323, 253, 1273, 7680, 812, 368, 3662, 634, 2139, 1953, 50275, 23, 4496, 1908, 5386, 13989, 337, 285, 374, 285, 2118, 13989, 495, 281, 253, 30762, 13989, 337, 285, 374, 513, 417, 1361, 3676, 257, 253, 4685, 273, 253, 2022, 4736, 273, 436, 2929, 12014, 4496, 48399, 2593, 495, 323, 25930, 534, 403, 20804, 281, 253, 2022, 4736, 50274, 2520, 2929, 3400, 690, 1774, 1543, 670, 253, 48960, 1985, 358, 12844, 10454, 594, 891, 6273, 323, 5075, 14924, 533, 627, 403, 1142, 3374, 4404, 253, 4679, 27947, 285, 253, 4028, 273, 436, 2929, 50276, 187, 187, 4118, 18435, 27, 783, 2929, 1160, 247, 4891, 10527, 7680, 327, 253, 48960, 50276, 16691, 1320, 14493, 273, 33362, 4071, 11454, 6928, 50276, 35529, 253, 2929, 387, 253, 1655, 830, 556, 1142, 3374, 275, 253, 1750, 326, 253, 1885, 273, 253, 5222, 476, 5513, 253, 26647, 8037, 50276, 18, 2801, 10027, 253, 4477, 4648, 253, 2801, 5222, 347, 253, 17335, 323, 26647, 8037, 2299, 352, 310, 12744, 281, 479, 326, 48960, 10166, 6928, 452, 247, 4067, 26647, 8037, 476, 320, 5544, 407, 253, 1885, 273, 2801, 22429, 281, 9257, 12654, 436, 253, 4477, 452, 281, 387, 1878, 9257, 19928, 253, 2801, 10027, 281, 253, 6253, 1896, 9017, 594, 253, 26647, 2228, 310, 417, 8513, 285, 7277, 253, 1885, 273, 253, 2801, 22429, 275, 436, 10076, 50276, 14920, 2801, 10027, 253, 11454, 6928, 1537, 3037, 247, 2257, 273, 19886, 14013, 275, 253, 13461, 3340, 342, 48960, 3733, 50276, 4609, 2789, 253, 1885, 273, 253, 5222, 281, 320, 1512, 1781, 50275, 783, 4477, 513, 1347, 4679, 4645, 326, 342, 2801, 10027, 253, 26647, 8037, 4916, 4577, 285, 253, 22429, 2489, 4577, 2299, 352, 310, 9106, 12744, 281, 479, 326, 253, 2801, 10027, 2783, 275, 253, 4679, 403, 2686, 8654, 50276, 262, 812, 1335, 320, 253, 1083, 326, 342, 1463, 2801, 10027, 253, 1885, 273, 253, 22429, 275, 48960, 
3733, 310, 2686, 4577, 10941, 281, 326, 273, 253, 4076, 3733, 50274, 3062, 1189, 253, 4477, 943, 671, 19148, 326, 253, 1885, 273, 253, 22429, 2556, 281, 253, 4679, 403, 3365, 1512, 1781, 285, 597, 476, 417, 320, 908, 275, 253, 10527, 906, 281, 755, 667, 14282, 26647, 14493, 50274, 19, 253, 1885, 273, 253, 5222, 275, 1985, 358, 12844, 10454, 50276, 261, 6863, 436, 50276, 7041, 760, 6556, 323, 11454, 6928, 342, 337, 8512, 591, 3828, 2378, 627, 403, 625, 685, 581, 8512, 627, 476, 3365, 320, 581, 23586, 326, 33772, 269, 89, 285, 253, 643, 33772, 269, 89, 285, 597, 4336, 476, 35430, 1016, 643, 594, 253, 1885, 273, 253, 5222, 310, 9090, 417, 6863, 323, 667, 11454, 2990, 342, 625, 685, 581, 23586, 591, 3828, 275, 958, 253, 8037, 476, 320, 29556, 1781, 50274, 328, 9520, 891, 751, 253, 2929, 1077, 1199, 285, 891, 3524, 436, 2929, 812, 320, 3863, 2299, 253, 3916, 50276, 783, 1885, 273, 253, 5222, 476, 5513, 253, 26647, 8037, 310, 3365, 1512, 24363, 285, 2853, 19391, 891, 11907, 253, 4477, 281, 4336, 5386, 436, 1750, 285, 11929, 253, 2929, 281, 847, 85 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 253, 4477, 16058, 326, 253, 1885, 273, 2801, 5222, 275, 48960, 3733, 310, 6296, 1199, 4067, 685, 326, 275, 2629, 3733, 891, 1158, 436, 310, 417, 247, 2266, 16774, 21999, 273, 253, 4081, 3033, 253, 1921, 310, 627, 5368, 690, 14637, 275, 253, 14493, 275, 48960, 3733, 285, 2629, 3733, 534, 778, 320, 1027, 3103, 3365, 10941, 253, 1885, 273, 253, 2801, 5222, 310, 417, 26565, 891, 651, 751, 281, 923, 625, 5955, 327, 436, 359, 871, 326, 690, 273, 253, 5368, 3676, 4715, 3762, 778, 320, 760, 11076, 1037, 3451, 3021, 5277, 4209, 16774, 1941, 281, 921, 253, 2882, 566, 875, 253, 3762, 285, 3946, 310, 1774, 50275, 19, 627, 4961, 247, 1885, 273, 253, 2801, 5222, 275, 253, 4081, 3033, 534, 8018, 326, 253, 10166, 1566, 476, 39970, 1805, 604, 436, 1885, 310, 4577, 3103, 476, 359, 897, 690, 5609, 281, 3963, 907, 436, 1885, 1309, 3733, 281, 3157, 253, 26647, 3745, 253, 4477, 403, 8521, 281, 1918, 824, 2238, 273, 4679, 281, 1329, 616, 10527, 1543, 50275, 6438, 30080, 22559, 50276, 6438, 4361, 253, 2380, 432, 253, 4477, 891, 5439, 619, 13716, 50276, 18, 436, 2929, 3400, 247, 747, 10454, 3033, 323, 48960, 3733, 50276, 19, 253, 16774, 12820, 273, 436, 2929, 310, 12497, 50276, 7152, 33032, 2520, 2929, 29328, 747, 26647, 14493, 323, 48960, 3733, 275, 11454, 6928, 1754, 327, 253, 1985, 358, 12844, 10454, 841, 403, 625, 2087, 685, 2045, 1543, 275, 326, 597, 4647, 281, 11454, 6928, 273, 667, 6864, 4679, 403, 2684, 327, 260, 338, 274, 740, 285, 260, 338, 274, 2313, 342, 2709, 362, 1266, 35615, 841, 2905, 342, 253, 2022, 13483, 15602, 275, 253, 4081, 14493, 2085, 271, 8813, 323, 253, 3710, 26647, 5350, 273, 48960, 3733, 20544, 50276, 15477, 9400, 273, 10527, 5839, 273, 48960, 6667, 285, 48960, 3733, 326, 556, 12103, 32535, 275, 3332, 1107, 50276, 303, 27369, 10527, 906, 689, 2720, 1445, 50276, 30407, 1020, 4679, 1754, 327, 253, 10527, 1543, 326, 1804, 690, 4606, 2139, 48960, 3733, 1057, 417, 39970, 973, 50276, 8250, 285, 973, 15720, 2929, 247, 1643, 963, 993, 3464, 50276, 20881, 1255, 265, 50276, 262, 651, 320, 1270, 604, 253, 2929, 2908, 247, 5301, 281, 5368, 48960, 14493, 1754, 327, 1985, 358, 12844, 10454, 390, 643, 31225, 24088, 17944, 323, 247, 20953, 1650, 390, 1355, 2990, 14999, 436, 352, 310, 12150, 281, 5963, 253, 7756, 253, 1655, 2929, 2789, 689, 2720, 1543, 275, 2426, 273, 6863, 1255, 273, 253, 3033, 25761, 352, 1537, 320, 4409, 10941, 253, 7680, 281, 643, 3510, 273, 10527, 7274, 275, 253, 1673, 24088, 253, 872, 494, 3082, 326, 253, 2929, 28070, 50276, 34974, 285, 643, 5701, 50276, 262, 651, 320, 1175, 281, 762, 1282, 326, 253, 14493, 2530, 671, 2186, 323, 27311, 267, 11454, 6928, 4321, 685, 253, 4679, 2593, 390, 625, 3839, 752, 8090, 403, 6107, 50276, 5371, 310, 253, 3486, 273, 970, 23256, 69, 48960, 6667, 275, 3946, 3185, 273, 253, 8654, 20452, 50276, 261, 352, 5272, 281, 1908, 247, 2957, 275, 253, 2491, 14805, 323, 11454, 6928, 50276, 609, 1985, 358, 12844, 10454, 14493, 6863, 2217, 323, 11454, 6928, 281, 320, 27096, 390, 7763, 275, 3946, 50276, 11183, 1501, 5955, 891, 717, 12976, 619, 13716, 407, 581, 1127, 1563, 253, 23261, 2708, 1175, 10527, 906, 4516, 407, 5661, 7103, 327, 4623, 9400, 5474, 33032, 2520, 2929, 689, 3217, 690, 7681, 12748, 281, 2085, 48960, 1985, 358, 12844, 10454, 273, 3676, 11454, 6928, 2429, 342, 5368, 6239, 534, 1611, 281, 921, 643, 11640, 273, 48960, 1985, 358, 12844, 10454, 436, 2929, 3587, 2987, 327, 48960, 1985, 358, 12844, 10454, 3139, 436, 2929, 3400, 1097, 253, 2406, 3033, 285, 253, 5170, 3033, 16280, 436, 2929, 2589, 10704, 4679, 281, 13398, 342, 253, 10527, 3033, 281, 
15249, 2139, 48960, 3733, 556, 247, 7197, 26647, 685, 2629, 3733, 891, 1158, 253, 2022, 7680, 273, 436, 2929, 310, 4722, 352, 3587, 689, 3217, 253, 12748, 275, 44190, 48960, 1985, 358, 12844, 10454, 2581, 685, 970, 643, 11640, 352, 3400, 1097, 253, 5170, 3033, 285, 2406, 3033, 1097, 273, 534, 403, 1774, 50275, 35529, 627, 403, 1335, 247, 2257, 273, 1841, 326, 476, 320, 5520, 275, 436, 2929, 2708, 403, 619, 2201, 7350, 4404, 436, 2929, 6960, 275, 6349, 619, 13716, 310, 4390, 5075, 2997, 1677, 253, 6349, 273, 253, 5170, 285, 2406, 14493, 533, 352, 812, 320, 10904, 1754, 327, 253, 2488, 2380, 4404, 619, 7350, 50275, 18, 619, 4685, 4404, 253, 48960, 3733, 285, 253, 48960, 1985, 358, 12844, 10454, 310, 326, 1097, 48960, 32927, 285, 2629, 32927, 11454, 6928, 452, 616, 1211, 2629, 1985, 358, 12844, 10454, 285, 48960, 1985, 358, 12844, 10454, 594, 253, 26647, 8037, 273, 48960, 3733, 2957, 285, 48960, 5175, 2957, 310, 4067, 685, 253, 26647, 8037, 273, 2629, 3733, 2957, 285, 2629, 5175, 2957, 323, 1097, 48960, 32927, 9173, 347, 247, 285, 270, 285, 2629, 32927, 11454, 6928, 9173, 347, 260, 285, 277, 26332, 490, 285, 22942, 327, 253, 643, 1133, 949, 4679, 352, 310, 2540, 326, 253, 18539, 274, 1365, 10166, 11454, 6928, 31326, 4067, 22429, 594, 697, 26647, 310, 7197, 26332, 519, 310, 619, 4685, 3451, 310, 352, 5667, 281, 2085, 1941, 323, 913, 390, 270, 69, 275, 643, 281, 7525, 519, 275, 4677, 337, 752, 1057, 253, 26647, 8037, 3730, 281, 50276, 249, 1635, 812, 368, 2085, 690, 16039, 327, 2139, 253, 5222, 273, 18539, 274, 1365, 10166, 1566, 310, 4067, 2139, 1057, 3629, 253, 3530, 1979, 1421, 281, 247, 4067, 5222, 15456, 671, 1677, 841, 7313, 310, 627, 667, 1039, 281, 3157, 253, 26647, 3045, 273, 48960, 3733, 310, 627, 667, 12739, 327, 253, 2957, 13016, 273, 48960, 10237, 11454, 6928, 253, 1655, 2593, 721, 12646, 690, 7313, 533, 642, 7000, 16039, 50276, 18, 284, 1774, 347, 253, 1840, 253, 1655, 4737, 273, 10012, 495, 760, 2296, 407, 253, 1543, 273, 253, 2406, 14493, 273, 50276, 664, 4044, 326, 50276, 32897, 2085, 625, 4278, 327, 253, 5368, 1543, 285, 849, 281, 4044, 253, 2457, 906, 4496, 2085, 690, 11859, 33954, 281, 436, 2057, 275, 31697, 390, 275, 253, 5955, 50275, 20, 253, 6239, 2278, 273, 26647, 273, 48960, 3733, 275, 436, 2929, 760, 19401, 1110, 670, 1985, 358, 12844, 10454, 627, 403, 1142, 643, 2175, 2444, 327, 253, 26647, 432, 643, 10527, 7794, 2708, 403, 690, 1774, 7774, 4496, 513, 690, 6239, 2561, 275, 436, 2087, 2170, 273, 10527, 1263, 285, 2486, 275, 436, 2929, 50276, 50004, 534, 2085, 1097, 5170, 3033, 285, 2406, 3033, 50276, 21329, 260, 864, 340, 9634, 359, 74, 285, 819, 796, 554, 1218, 30127, 22711, 9479, 7605, 1149, 33573, 265, 323, 18539, 274, 1365, 10237, 305, 12064, 9162, 5213, 8059, 327, 5145, 4715, 268, 1686, 83, 9169, 50276, 89, 272, 340, 489, 8864, 478, 5801, 1182, 12109, 285, 1149, 606, 260, 24176, 18539, 274, 1365, 10237, 6642, 285, 2495, 1783, 275, 4872, 9077, 5213, 8059, 327, 13345, 9260, 285, 9990, 268, 1686, 83, 43425, 50276, 50004, 670, 26647, 3607, 50276, 455, 12586, 11917, 1182, 2653, 9041, 285, 340, 9041, 91, 5801, 632, 4735, 23609, 849, 48960, 3733, 17923, 10237, 3676, 4715, 549, 32693, 638, 3845, 549, 32693, 1518, 3712, 520, 2270, 9169, 50276, 75, 18444, 78, 472, 519, 293, 35926, 5168, 1220, 85, 9201, 46841, 18754, 285, 288, 3163, 38193, 6451, 10799, 5454, 14273, 275, 48960, 3733, 323, 4872, 9077, 8059, 327, 4715, 3762, 268, 1686, 83, 9169, 50276, 75, 18444, 78, 472, 519, 293, 285, 35926, 5168, 1220, 85, 9201, 46841, 18754, 10799, 7605, 1783, 273, 9162, 3933, 19103, 323, 
48960, 3733, 549, 32693, 638, 3845, 549, 32693, 1252, 520, 805, 1012, 9169, 50276, 893, 248, 363, 288, 375, 34103, 17653, 31987, 7690, 1032, 6451, 285, 37622, 375, 7635, 1301, 3941, 30861, 20185, 3879, 273, 48960, 3733, 275, 8985, 9162, 549, 32693, 638, 3845, 549, 32693, 1252, 520, 1237, 1976, 9169, 50276, 44217, 277, 543, 89, 757, 6294, 8500, 1269, 571, 285, 340, 11889, 259, 606, 48960, 2801, 20452, 7729, 10237, 26647, 16424, 275, 11454, 1491, 5162, 2718, 5922, 9169, 50276, 89, 272, 340, 489, 2805, 338, 266, 4498, 285, 1149, 606, 260, 24176, 327, 253, 26647, 3607, 273, 48960, 3733, 5213, 8059, 327, 13345, 9260, 285, 9990, 268, 1686, 83, 43425, 50276, 91, 16926, 1408, 85, 757, 1162, 355, 18539, 274, 1365, 10237, 26647, 816, 4419, 625, 440, 22027, 941, 549, 32693, 638, 3845, 549, 32693, 746, 3071, 361, 27865, 6247, 50275, 21, 253, 2022, 2934, 273, 436, 2929, 310, 417, 1892, 281, 956, 285, 253, 4477, 1056, 247, 2257, 273, 5301, 281, 5368, 6239, 352, 651, 320, 1270, 604, 253, 4477, 812, 1056, 352, 2590, 670, 253, 1563, 3533, 672, 12930, 253, 4737, 5018, 275, 10012, 337, 50273, 18, 534, 5018, 403, 1027, 432, 253, 28529, 273, 2629, 1985, 358, 12844, 10454, 534, 5018, 403, 417, 5667, 604, 359, 897, 10012, 337, 285, 1379, 299, 4277, 17, 752, 5018, 943, 359, 10007, 281, 4044, 253, 2629, 1985, 358, 12844, 10454, 5393, 275, 2593, 8676, 50273, 19, 534, 5018, 403, 1027, 432, 253, 6239, 670, 48960, 1985, 358, 12844, 10454, 534, 5018, 513, 597, 17049, 50273, 20, 812, 368, 5513, 253, 7579, 846, 1078, 10012, 374, 275, 2508, 50274, 37585, 2523, 417, 6960, 275, 6349, 50276, 22, 672, 29570, 634, 9021, 275, 253, 1390, 12494, 273, 2593, 337, 812, 368, 3630, 690, 20121, 50272, 18, 323, 253, 806, 7680, 310, 627, 667, 4722, 4342, 275, 253, 3033, 50273, 19, 323, 253, 1273, 7680, 812, 368, 3662, 634, 2139, 1953, 50275, 23, 4496, 1908, 5386, 13989, 337, 285, 374, 285, 2118, 13989, 495, 281, 253, 30762, 13989, 337, 285, 374, 513, 417, 1361, 3676, 257, 253, 4685, 273, 253, 2022, 4736, 273, 436, 2929, 12014, 4496, 48399, 2593, 495, 323, 25930, 534, 403, 20804, 281, 253, 2022, 4736, 50274, 2520, 2929, 3400, 690, 1774, 1543, 670, 253, 48960, 1985, 358, 12844, 10454, 594, 891, 6273, 323, 5075, 14924, 533, 627, 403, 1142, 3374, 4404, 253, 4679, 27947, 285, 253, 4028, 273, 436, 2929, 50276, 187, 187, 4118, 18435, 27, 783, 2929, 1160, 247, 4891, 10527, 7680, 327, 253, 48960, 50276, 16691, 1320, 14493, 273, 33362, 4071, 11454, 6928, 50276, 35529, 253, 2929, 387, 253, 1655, 830, 556, 1142, 3374, 275, 253, 1750, 326, 253, 1885, 273, 253, 5222, 476, 5513, 253, 26647, 8037, 50276, 18, 2801, 10027, 253, 4477, 4648, 253, 2801, 5222, 347, 253, 17335, 323, 26647, 8037, 2299, 352, 310, 12744, 281, 479, 326, 48960, 10166, 6928, 452, 247, 4067, 26647, 8037, 476, 320, 5544, 407, 253, 1885, 273, 2801, 22429, 281, 9257, 12654, 436, 253, 4477, 452, 281, 387, 1878, 9257, 19928, 253, 2801, 10027, 281, 253, 6253, 1896, 9017, 594, 253, 26647, 2228, 310, 417, 8513, 285, 7277, 253, 1885, 273, 253, 2801, 22429, 275, 436, 10076, 50276, 14920, 2801, 10027, 253, 11454, 6928, 1537, 3037, 247, 2257, 273, 19886, 14013, 275, 253, 13461, 3340, 342, 48960, 3733, 50276, 4609, 2789, 253, 1885, 273, 253, 5222, 281, 320, 1512, 1781, 50275, 783, 4477, 513, 1347, 4679, 4645, 326, 342, 2801, 10027, 253, 26647, 8037, 4916, 4577, 285, 253, 22429, 2489, 4577, 2299, 352, 310, 9106, 12744, 281, 479, 326, 253, 2801, 10027, 2783, 275, 253, 4679, 403, 2686, 8654, 50276, 262, 812, 1335, 320, 253, 1083, 326, 342, 1463, 2801, 10027, 253, 1885, 273, 253, 22429, 275, 48960, 
3733, 310, 2686, 4577, 10941, 281, 326, 273, 253, 4076, 3733, 50274, 3062, 1189, 253, 4477, 943, 671, 19148, 326, 253, 1885, 273, 253, 22429, 2556, 281, 253, 4679, 403, 3365, 1512, 1781, 285, 597, 476, 417, 320, 908, 275, 253, 10527, 906, 281, 755, 667, 14282, 26647, 14493, 50274, 19, 253, 1885, 273, 253, 5222, 275, 1985, 358, 12844, 10454, 50276, 261, 6863, 436, 50276, 7041, 760, 6556, 323, 11454, 6928, 342, 337, 8512, 591, 3828, 2378, 627, 403, 625, 685, 581, 8512, 627, 476, 3365, 320, 581, 23586, 326, 33772, 269, 89, 285, 253, 643, 33772, 269, 89, 285, 597, 4336, 476, 35430, 1016, 643, 594, 253, 1885, 273, 253, 5222, 310, 9090, 417, 6863, 323, 667, 11454, 2990, 342, 625, 685, 581, 23586, 591, 3828, 275, 958, 253, 8037, 476, 320, 29556, 1781, 50274, 328, 9520, 891, 751, 253, 2929, 1077, 1199, 285, 891, 3524, 436, 2929, 812, 320, 3863, 2299, 253, 3916, 50276, 783, 1885, 273, 253, 5222, 476, 5513, 253, 26647, 8037, 310, 3365, 1512, 24363, 285, 2853, 19391, 891, 11907, 253, 4477, 281, 4336, 5386, 436, 1750, 285, 11929, 253, 2929, 281, 847, 85 ]
Below is a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: This paper proposes a neural-network-based contextual bandit algorithm in the offline setting, where a dataset of contexts and rewards is given by a logging policy. The goal of the proposed algorithm is to learn an optimal policy from the offline dataset. The proposed algorithm, NeuralLCB, is structured similarly to the NeuralUCB algorithm (Zhou et al., 2019) for the online setting; the differences are that it uses a lower confidence bound for estimating the reward function instead of an upper confidence bound, and that the optimization procedure for learning the neural network representation is based on the loss on one data point instead of the whole historical data. The authors established an upper bound on the optimality gap of the learned policy and evaluated the algorithm on both simulation and UCI datasets.

Some references are repeated in the bibliography; for example, there are multiple entries for the same papers: "Is pessimism provably efficient for offline RL", "Gradient descent provably optimizes overparameterized neural networks", and "Neural contextual bandits with UCB-based exploration". In the related work, there is another recent paper that studies policy learning in contextual bandits using the same neural network structure as in this paper: Xu, Wen, Zhao, and Gu (2020), "Neural contextual bandits with deep representation and shallow exploration", arXiv:2012.01780. Similar to the claim of the current work, that paper also improves the computational complexity of neural contextual bandits to a large extent. In line 4 of Algorithm 1, are you retrieving the data tuple randomly (sampling without replacement) or just in a fixed order? Please also clarify the notation of x_t and x_i. In the introduction, the authors stated that actions in the offline data are independent and depend only on the current state in Rashidinejad et al. (2021); however, it should also be clearly stated that the current paper does not resolve this problem, since by Assumption 4.2 the actions still need to be independent from each other. Under Assumption 4.2, the authors claimed that the data coverage condition is only imposed on the observed feature contexts; however, a significant difference between the setting of this paper and others that rely on uniform coverage is that the total number of arm contexts is fixed as nK, as shown in Assumption 4.1, while other papers such as Nguyen-Tang et al. (2021) do not require this condition. Is this the crucial reason for the relaxation of the condition? In Theorem 1, it is claimed that the network width m is a polynomial in the number of data points n, but it is unclear to me what the exact dependence of m on n is. In addition, did you also validate this dependency in the experimental settings (m = 100, T = 15000)?

After the authors' response: I am satisfied with the responses to my questions. I agree that this paper makes good progress in connecting neural contextual bandits and offline settings, and I am willing to increase my score for this paper. My recommendation is mainly based on the theoretical novelty and the applicability of the theory.

docsep This paper is the first study that considers offline policy learning for contextual bandits with neural networks. The authors propose the NeuralLCB algorithm, which uses a neural network to model the rewards and follows the pessimism principle, with a lower confidence bound, in policy learning. It is a very intuitive combination. The algorithm works under mild assumptions, with a theoretical guarantee on its suboptimality. Experiments also showed that NeuralLCB outperforms other baselines.

Strengths: compared with existing works, NeuralLCB requires weaker assumptions on data coverage (an empirical single-policy concentration condition) and on data generation (the data can be dependent on the history of past data), and I think this is a major contribution of the paper; the weaker assumptions make the algorithm applicable to more practical settings. A theoretical analysis of the suboptimality attained by NeuralLCB is provided. The algorithm essentially works in an online fashion, training on one data point at each iteration, so several intermediate results can be directly applied from, or are very similar to, online regret minimization in neural bandits, e.g., NeuralUCB (Zhou et al., 2019), if my understanding is correct. Due to the different goals of online and offline policy learning, rather than improved technical lemmas over Zhou et al. (2019), the design of NeuralLCB is very different from NeuralUCB, e.g., single-data-point SGD versus full gradient descent at every iteration. I checked the main theorem and several lemmas; they look sound, and I only found a few mistakes (see below).

Weaknesses: there are mistakes in the proof of Lemma B.5. The log det function is concave instead of convex; when bounding the difference of the log det functions, the first inequality does not hold precisely because log det is concave. By a Taylor expansion we have log det(X) - log det(Y) <= <Y^{-1}, X - Y>_F, but the inequality does not hold when taking the absolute value of both sides, which is what the current proof does. This problem should be fixable, and Lemma B.5 should still hold. In the experiments, the authors mention a B-mode variant of NeuralLCB and NeuralGreedy that updates with a small batch of data points at each iteration; the results are reported after a grid search over B-mode and S-mode (one-step SGD on a single data point). I would suggest that the authors report the performance of S-mode and B-mode separately. In the experiments on the real-world datasets, the authors observed that S-mode performed better than B-mode; again, both results should be plotted. That is an interesting observation, and it would be helpful to provide some explanation for it.

Other comments: besides the theoretical convenience of learning in an online manner, are there any other reasons to avoid full gradient descent training? It sounds very natural for offline learning. The authors mention after Assumption 4.2 that NeuralLCB works with dependent data (the offline data was collected by an active learner such as a Q-learning agent); currently the synthetic experiment uses independent data, and it would be interesting to see whether NeuralLCB works well with dependent data, e.g., by using some bandit algorithm to collect the dataset.

Overall, I think this is a good paper with a straightforward idea of combining pessimism and neural bandits for offline policy learning. I am open to revising my rating if the authors can address the concerns.

docsep The paper studies the problem of offline contextual bandits, where policy learning can only leverage a fixed dataset collected a priori by behavior policies. Using a pessimism principle, the authors propose a new algorithm called NeuralLCB with overparameterized neural networks and provide theoretical regret guarantees based on the analysis framework of the neural tangent kernel. Experiments on both synthetic and real-world data are conducted, which confirm the theoretical results.

Pros: (a) the paper provides concrete theoretical analysis to support the proposed algorithm; I have not gone through all the derivations, but the overall result looks good; (b) comprehensive comparisons with other related works are properly presented; (c) the paper is generally well written, and the required assumptions are discussed clearly.

Cons: (a) apart from the theoretical analysis, it would be better if the paper could throw some light on algorithm-design considerations beyond an LCB-like algorithm; (b) it is not very clear how the improvement in the sqrt(d) and n dependence is achieved; it would be better to give a more concrete discussion.

Overall, the paper studies a new problem and presents a good analysis. Since the technique is quite similar to existing works, it would be much better to present the necessary discussion to support the claimed contributions.

docsep I thank the authors for the clarification; some of my former comments were nicely addressed, and hence I am willing to increase my rating to 6. This paper considers the offline setting of the contextual bandit with neural network function approximation. The key idea of the proposed NeuralLCB is to use a neural network to learn the reward function and to use a pessimism principle, via a lower confidence bound (LCB), for decision making. In theory, the proposed approach is shown to learn the optimal policy with an error bound O(kappa * tilde{d}^{1/2} * n^{-1/2}), where kappa measures the distributional shift and tilde{d} is an effective dimension of the neural network. The empirical effectiveness of the proposed method is shown on a range of synthetic and real-world off-policy learning problems.

Pros: 1. The pessimism principle (Kidambi et al., 2020; Buckman et al., 2020; Jin et al., 2020) is a recent idea introduced in off-policy learning to avoid the strong assumption of sufficient coverage of the data; the key idea is to consider a regularized version, i.e., a lower confidence bound (LCB). After its introduction it has been applied in various off-policy RL and off-policy bandit settings, and this paper further advances this area by introducing the idea into the neural contextual bandit setting. 2. This paper is very well written and is easy to follow.

Cons: 1. Related to my previous comments on the pessimism principle, the main contribution of this paper is to incorporate the pessimism principle into the neural contextual bandit setting. It combines the strengths of both Rashidinejad et al. (2021) and Zhou et al. (2020), where the former studies the lower-confidence-bound pessimism principle for decision making in tabular MDPs and contextual bandits, and the latter studies the neural contextual bandit with an upper confidence bound in the typical online setting; therefore, the technical advances over these existing works are expected and routine. 2. In Algorithm 1, why randomly sample the policy uniformly from hat{pi}_1, ..., hat{pi}_n? 3. Regarding Theorem 4.1 and the error bound O(kappa * tilde{d}^{1/2} * n^{-1/2}): (1) Assumption 4.2 requires kappa to be a uniform upper bound over all sample sizes n and all samples x_t; kappa would then be a function of the dimension and the sample size, and in this case the error bound might diverge. It would be more convincing if the authors could provide some justification or examples of when kappa would be small. (2) Similarly, the effective dimension tilde{d} of the neural network might be very large. This effective dimension was used for the online neural contextual bandit (Zhou et al., 2019) and the neural MDP (Yang et al., 2020), and the authors commented that tilde{d} in these references is typically small, on the order of log n; however, it is unknown whether tilde{d} in the proposed off-policy neural contextual bandit is still small in order to ensure this
one would inevitably assume conditions on the data generation process to control the decay rate of the eigenvalues of the gram matrix h it would be more convincing if the authors could provide a detailed quantification of the data generation process the technical contribution of this paper is ok but not strong the assumptions and theory of the paper need substantial clarification ### Summary:
This paper studies off-policy learning of contextual bandits with neural network generalization. The proposed algorithm, neuralcb, acts based on pessimistic estimates of the rewards obtained through lower confidence bounds, and it is both analyzed and empirically evaluated. This paper received four borderline reviews, which improved during the rebuttal phase. The main strengths of this paper are that it is well executed and that the result is timely, considering the recent advances in pessimism for offline RL. The weakness is that the result is not very technically novel, being essentially a direct combination of pessimism with neural networks. This paper was discussed, and all reviewers agreed that the strengths of this paper outweigh its weaknesses. I agree and recommend that this paper be accepted.
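For readability: the reviews above refer to several expressions that survive only as garbled plain text ("okappa tilded12 n12", the Lemma B.5 remark on log det, "hatpi1 ldots hatpin"). The LaTeX sketch below reconstructs what these most plausibly denote; the exact notation and the form of the confidence bonus are assumptions, since the reviewed paper itself is not reproduced in this dump.

```latex
% sub-optimality bound written above as "okappa tilded12 n12"
\mathrm{SubOpt}(\hat{\pi}) = \tilde{O}\!\left(\kappa\,\tilde{d}^{\,1/2} n^{-1/2}\right),
\qquad \kappa:\ \text{distributional-shift coefficient},\quad
\tilde{d}:\ \text{effective dimension},\quad n:\ \text{sample size}.

% generic pessimistic (LCB) action selection of the kind the reviews describe
\hat{\pi}(x) \in \arg\max_{a}\Big(\hat{f}(x,a) - \beta\,\big\|\nabla_{\theta} f(x,a;\hat{\theta})\big\|_{\Lambda^{-1}}\Big).

% first-order bound behind the Lemma B.5 remark: \log\det is concave, so
\log\det X - \log\det Y \le \big\langle Y^{-1},\, X - Y \big\rangle_F,
% which bounds the difference but not its absolute value, as the reviewer notes.
```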
[ 253, 2644, 9493, 941, 253, 4477, 4232, 271, 5170, 3033, 273, 253, 8654, 8037, 273, 253, 6311, 3646, 285, 6760, 253, 5933, 327, 1097, 9864, 285, 44274, 74, 15302, 50275, 8826, 10414, 403, 6015, 275, 253, 20314, 10580, 323, 1650, 627, 403, 2709, 12028, 323, 253, 1072, 9380, 310, 45234, 1204, 872, 1598, 5919, 323, 28841, 391, 77, 11786, 18499, 872, 1598, 5556, 4219, 689, 19484, 1025, 11454, 6928, 11454, 33876, 3961, 953, 342, 44274, 67, 3169, 17947, 275, 253, 2905, 789, 627, 310, 1529, 3332, 2929, 326, 2175, 3646, 4715, 275, 33876, 3961, 262, 970, 253, 1072, 11454, 2990, 2605, 347, 275, 436, 2929, 1269, 86, 268, 259, 257, 1182, 1182, 31035, 288, 50276, 4297, 2805, 9169, 11454, 33876, 3961, 953, 342, 3676, 6779, 285, 20126, 17947, 549, 32693, 638, 3845, 549, 32693, 1252, 7132, 1438, 2074, 281, 253, 1750, 273, 253, 1655, 789, 436, 2929, 671, 19132, 253, 15180, 10454, 273, 11454, 33876, 3961, 953, 281, 247, 1781, 6070, 50275, 249, 1386, 577, 273, 5933, 337, 403, 368, 48484, 253, 941, 31343, 12421, 3410, 1293, 5407, 390, 816, 275, 247, 4229, 1340, 253, 14951, 273, 209, 633, 285, 1269, 74, 50276, 249, 253, 10199, 253, 4477, 4767, 326, 5231, 275, 253, 28841, 941, 403, 3907, 285, 3469, 760, 327, 253, 1655, 1375, 275, 35511, 13196, 75, 324, 1162, 355, 43425, 2299, 352, 943, 320, 671, 4518, 4767, 326, 253, 1655, 2929, 858, 417, 11322, 436, 1895, 1580, 407, 9376, 5976, 359, 871, 326, 253, 4477, 275, 436, 2929, 1335, 878, 253, 5231, 281, 320, 3907, 432, 1016, 643, 50275, 4524, 9376, 5976, 253, 4477, 7558, 326, 253, 941, 7031, 1617, 310, 760, 11295, 327, 253, 2540, 4735, 22349, 2299, 247, 1534, 3064, 275, 253, 4758, 273, 436, 2929, 432, 2571, 326, 10725, 327, 6447, 7031, 310, 326, 253, 2264, 1180, 273, 4430, 22349, 403, 4229, 347, 295, 76, 2011, 275, 9376, 7609, 1223, 643, 9380, 824, 347, 295, 26619, 290, 606, 1162, 355, 43425, 513, 417, 2430, 436, 1617, 310, 436, 253, 9560, 1921, 323, 253, 17040, 273, 253, 1617, 50276, 249, 10012, 337, 352, 310, 7558, 326, 253, 2990, 4871, 278, 310, 247, 14189, 275, 253, 1180, 273, 941, 295, 533, 352, 310, 12744, 281, 479, 752, 253, 3242, 10096, 273, 278, 327, 253, 1180, 273, 941, 295, 310, 275, 1635, 858, 368, 671, 17813, 253, 18925, 275, 253, 3368, 7533, 278, 2313, 246, 1010, 933, 50275, 6438, 253, 4477, 2380, 50268, 74, 717, 10048, 342, 253, 6128, 281, 619, 3533, 891, 5194, 326, 436, 2929, 2789, 1175, 4780, 275, 12873, 11454, 33876, 3961, 953, 285, 28841, 7533, 891, 717, 7378, 281, 2572, 619, 4868, 323, 436, 2929, 619, 17401, 310, 7194, 1754, 327, 253, 10527, 38135, 285, 253, 30437, 273, 253, 3762, 50276, 7152, 33032, 2520, 2929, 310, 253, 806, 1263, 19401, 28841, 3646, 4715, 323, 33876, 3961, 953, 342, 11454, 6928, 253, 4477, 4081, 11454, 11316, 5933, 326, 908, 11454, 2990, 281, 1566, 253, 23267, 285, 3560, 45234, 1204, 8063, 342, 2406, 7162, 3033, 275, 3646, 4715, 352, 310, 247, 1077, 27350, 5019, 253, 5933, 2987, 342, 11134, 13260, 342, 10527, 12215, 327, 697, 749, 32581, 1319, 4679, 671, 2692, 326, 11454, 11316, 41731, 13015, 643, 1666, 25379, 4757, 50275, 681, 48434, 342, 5368, 2987, 11454, 11316, 4419, 21076, 13260, 327, 941, 7031, 16774, 2014, 22872, 4719, 1617, 285, 941, 5978, 941, 476, 320, 7976, 327, 2892, 50276, 32628, 941, 285, 891, 1158, 436, 310, 247, 2201, 7680, 273, 253, 2929, 253, 21076, 9376, 2789, 253, 5933, 7763, 281, 625, 8542, 7533, 50275, 783, 33977, 1783, 327, 749, 32581, 1319, 6311, 407, 11454, 11316, 310, 2530, 253, 5933, 9093, 2987, 275, 271, 3909, 8142, 326, 10166, 327, 581, 941, 1127, 387, 1016, 19502, 3021, 2067, 10444, 1543, 476, 320, 
3587, 3732, 432, 390, 1077, 2074, 281, 3909, 14938, 41458, 275, 11454, 3961, 953, 24088, 11454, 1028, 67, 1182, 14451, 1162, 355, 6247, 604, 619, 4685, 310, 3451, 1955, 281, 253, 1027, 4736, 875, 3909, 285, 28841, 3646, 4715, 3185, 273, 5520, 7681, 458, 44661, 689, 1182, 14451, 1162, 355, 6247, 253, 2216, 273, 11454, 11316, 310, 1077, 1027, 432, 11454, 1028, 67, 24088, 2014, 941, 1127, 256, 35333, 23252, 2120, 11786, 18499, 387, 1046, 19502, 50276, 74, 10141, 253, 2022, 10012, 285, 2067, 458, 44661, 597, 1007, 3590, 285, 891, 760, 1119, 247, 1643, 16503, 923, 2708, 50276, 20881, 1255, 50275, 33542, 1582, 275, 4737, 273, 18057, 270, 22, 50274, 2808, 843, 1159, 310, 40886, 3185, 273, 17133, 50274, 9453, 41113, 253, 3064, 273, 253, 2412, 843, 3470, 253, 806, 11370, 1057, 417, 2186, 984, 2412, 843, 1159, 310, 40886, 407, 246, 9614, 7466, 359, 452, 326, 2412, 843, 89, 50276, 2808, 843, 90, 458, 82, 340, 18, 1269, 90, 71, 533, 253, 11370, 1057, 417, 2186, 672, 3192, 253, 7880, 1318, 273, 1097, 7123, 534, 310, 253, 1083, 275, 1655, 4737, 436, 1895, 943, 320, 4993, 494, 285, 18057, 270, 22, 943, 1335, 2186, 50275, 249, 4679, 253, 4477, 5393, 247, 270, 9561, 12955, 273, 11454, 11316, 285, 11454, 24204, 6368, 326, 5731, 342, 247, 1355, 14604, 273, 941, 2792, 387, 1016, 19502, 253, 906, 310, 2361, 846, 9860, 3186, 689, 270, 9561, 285, 924, 853, 581, 3213, 256, 35333, 327, 2014, 941, 1127, 891, 651, 1804, 253, 4477, 281, 1304, 253, 3045, 273, 924, 853, 285, 270, 9561, 11794, 275, 3368, 327, 253, 1524, 10186, 15302, 253, 4477, 2540, 326, 924, 853, 2684, 1805, 685, 270, 9561, 969, 1097, 1543, 943, 320, 17944, 28763, 271, 4722, 8310, 352, 651, 320, 9371, 281, 2085, 690, 22909, 689, 436, 8310, 50276, 977, 5701, 50275, 67, 11587, 253, 10527, 16397, 273, 4715, 275, 271, 3909, 5133, 403, 627, 667, 643, 4606, 281, 3657, 2120, 11786, 18499, 3733, 352, 7835, 1077, 3626, 323, 28841, 4715, 50274, 783, 4477, 5393, 846, 9376, 5976, 326, 11454, 11316, 2987, 342, 3469, 941, 253, 28841, 941, 369, 5728, 407, 271, 3939, 458, 47612, 824, 347, 247, 2805, 28269, 5570, 4390, 253, 13506, 3368, 310, 327, 3907, 941, 352, 651, 320, 4722, 281, 923, 604, 11454, 11316, 812, 789, 973, 342, 3469, 941, 24088, 897, 690, 3961, 262, 11333, 281, 4822, 253, 10895, 50273, 1189, 455, 891, 1158, 436, 310, 247, 1175, 2929, 342, 247, 15246, 2934, 273, 16248, 45234, 1204, 285, 11454, 3961, 953, 323, 28841, 3646, 4715, 891, 717, 1527, 281, 49620, 619, 13716, 604, 253, 4477, 812, 2953, 253, 7350, 5474, 339, 431, 248, 2929, 2175, 253, 1895, 273, 28841, 33876, 3961, 953, 835, 3646, 4715, 476, 760, 25057, 247, 4229, 10895, 5728, 247, 30400, 407, 3879, 7823, 970, 247, 45234, 1204, 8063, 253, 4477, 12661, 247, 747, 5933, 1925, 5723, 455, 11316, 342, 689, 19484, 1025, 11454, 6928, 285, 2085, 10527, 14938, 23632, 1754, 327, 253, 1783, 7792, 273, 253, 11454, 28196, 10295, 4679, 327, 1097, 13506, 285, 1524, 10186, 941, 403, 5196, 534, 23849, 253, 10527, 1543, 5847, 50276, 66, 253, 2929, 3400, 11859, 10527, 1783, 281, 1329, 253, 4081, 5933, 891, 452, 417, 4783, 949, 512, 253, 3538, 569, 533, 253, 4583, 906, 4453, 1175, 50276, 67, 11088, 14023, 342, 643, 2905, 2987, 403, 6283, 3559, 50276, 68, 253, 2929, 310, 3839, 973, 3542, 253, 2424, 13260, 403, 5469, 4518, 50276, 5040, 50276, 66, 7419, 432, 10527, 1783, 352, 651, 320, 1805, 604, 253, 2929, 812, 4710, 690, 1708, 327, 5933, 2216, 8180, 4457, 298, 11316, 3022, 5933, 50276, 67, 352, 310, 417, 1077, 2590, 849, 253, 7756, 273, 8084, 69, 285, 327, 310, 6786, 352, 651, 320, 1805, 281, 1918, 625, 11859, 11985, 
50276, 1189, 455, 253, 2929, 2175, 247, 747, 1895, 285, 10262, 247, 1175, 1783, 1580, 253, 5853, 310, 3240, 2074, 281, 5368, 2987, 352, 651, 320, 1199, 1805, 281, 1246, 3309, 11985, 281, 1750, 616, 9021, 5474, 33032, 5717, 253, 4477, 323, 253, 37699, 690, 273, 619, 3438, 5701, 497, 23395, 9713, 285, 7613, 891, 717, 7378, 281, 2572, 619, 13716, 281, 721, 50275, 2520, 2929, 19401, 253, 28841, 4758, 273, 253, 33876, 3961, 262, 342, 11454, 2990, 1159, 11193, 253, 2234, 2934, 273, 253, 4081, 11454, 11316, 310, 281, 897, 11454, 2990, 281, 3037, 253, 10921, 1159, 285, 897, 247, 45234, 1204, 8063, 3066, 247, 2406, 7162, 3033, 298, 11316, 323, 3061, 2403, 275, 3762, 253, 4081, 2746, 310, 2011, 281, 3037, 253, 8654, 3646, 342, 271, 2228, 3033, 258, 6165, 246, 786, 264, 805, 295, 805, 835, 465, 5596, 5593, 253, 3268, 267, 5333, 285, 246, 786, 264, 310, 271, 3576, 7877, 273, 253, 11454, 2990, 253, 16774, 12510, 273, 253, 4081, 1332, 310, 2011, 275, 247, 2491, 273, 13506, 285, 1524, 10186, 745, 22872, 4715, 3237, 50275, 856, 84, 50275, 18, 253, 45234, 1204, 8063, 5772, 1369, 74, 1162, 355, 9169, 12433, 1342, 1162, 355, 9169, 480, 249, 1162, 355, 9169, 310, 247, 3332, 2934, 5611, 275, 745, 22872, 4715, 281, 3693, 253, 2266, 9376, 273, 4209, 7031, 273, 253, 941, 253, 2234, 2934, 310, 281, 1908, 247, 37820, 2715, 26332, 2406, 7162, 3033, 298, 11316, 846, 697, 10199, 352, 556, 644, 3732, 275, 2710, 745, 22872, 391, 77, 285, 745, 22872, 3961, 262, 7533, 436, 2929, 2007, 16424, 436, 2170, 407, 16984, 436, 2934, 715, 253, 11454, 33876, 3961, 262, 4758, 50274, 19, 436, 2929, 310, 1077, 973, 3542, 285, 310, 3477, 281, 956, 50275, 5040, 50275, 18, 2905, 281, 619, 2045, 5701, 327, 253, 45234, 1204, 8063, 253, 2022, 7680, 273, 436, 2929, 310, 281, 19071, 253, 45234, 1204, 8063, 11454, 33876, 3961, 262, 4758, 352, 24772, 253, 4757, 273, 1097, 35511, 13196, 75, 324, 1162, 355, 43425, 285, 1182, 14451, 1162, 355, 9169, 835, 253, 3438, 2175, 253, 2406, 7162, 3033, 45234, 1204, 8063, 323, 3061, 11849, 275, 10334, 792, 278, 12132, 285, 33876, 3961, 262, 285, 253, 6158, 2175, 11454, 33876, 3961, 262, 342, 5170, 7162, 3033, 275, 6867, 3909, 4758, 3103, 253, 7681, 11361, 689, 841, 5368, 789, 403, 3264, 285, 10934, 50275, 19, 275, 5933, 337, 2139, 281, 12421, 3410, 3646, 17568, 432, 7856, 2059, 18, 298, 6768, 7856, 9852, 50275, 20, 275, 10012, 7609, 253, 2228, 3033, 258, 6165, 246, 786, 264, 805, 295, 805, 465, 5596, 5593, 253, 3268, 267, 5333, 285, 246, 786, 264, 310, 271, 3576, 7877, 273, 253, 11454, 2990, 50275, 18, 253, 9376, 5976, 4419, 465, 5596, 281, 320, 247, 6447, 5170, 3033, 689, 512, 3410, 1979, 295, 285, 512, 3530, 209, 633, 465, 5596, 651, 320, 247, 1159, 273, 253, 7877, 285, 253, 3410, 1979, 275, 436, 1083, 253, 2228, 3033, 1537, 11711, 463, 352, 651, 320, 625, 21414, 604, 253, 4477, 812, 2085, 690, 816, 6787, 390, 6667, 672, 465, 5596, 651, 320, 1355, 50276, 19, 12014, 253, 3576, 7877, 273, 253, 11454, 2990, 246, 786, 264, 1537, 320, 1077, 1781, 436, 3576, 7877, 273, 253, 11454, 2990, 369, 908, 323, 3909, 11454, 33876, 3961, 262, 1182, 14451, 1162, 355, 6247, 285, 11454, 278, 12132, 30966, 1162, 355, 9169, 253, 4477, 20503, 326, 246, 786, 264, 275, 841, 10414, 497, 5431, 1355, 275, 253, 1340, 273, 298, 2331, 2299, 352, 310, 7202, 1880, 50276, 85, 786, 264, 275, 253, 4081, 745, 22872, 11454, 33876, 3961, 262, 310, 1335, 1355, 275, 1340, 281, 5416, 436, 581, 651, 24473, 5467, 2515, 327, 253, 941, 5978, 1232, 281, 1453, 253, 46957, 3885, 273, 20223, 273, 253, 29975, 4315, 288, 352, 651, 320, 625, 21414, 604, 
253, 4477, 812, 2085, 7000, 21652, 273, 941, 5978, 1232, 50275, 783, 7681, 7680, 273, 436, 2929, 310, 8718, 533, 417, 2266, 253, 13260, 285, 3762, 273, 253, 2929, 878, 6832, 8254, 6787, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 2175, 745, 22872, 4715, 273, 33876, 3961, 953, 342, 11454, 2990, 26647, 253, 4081, 5933, 11454, 11316, 6993, 1754, 327, 45234, 2531, 8197, 273, 253, 23267, 2797, 949, 2406, 7162, 14493, 11454, 11316, 310, 1097, 5867, 285, 45190, 6760, 50276, 2520, 2929, 2959, 1740, 45210, 10123, 534, 5520, 1309, 253, 30080, 22559, 3408, 253, 2022, 20544, 273, 436, 2929, 403, 326, 352, 310, 973, 11407, 285, 326, 253, 906, 310, 14793, 7296, 253, 3332, 16424, 275, 45234, 1204, 323, 28841, 391, 77, 253, 14855, 310, 326, 253, 906, 310, 417, 1077, 22335, 4460, 9093, 247, 1480, 5019, 273, 45234, 1204, 342, 11454, 6928, 436, 2929, 369, 5469, 285, 512, 30628, 5821, 326, 253, 20544, 273, 436, 2929, 32180, 798, 697, 32213, 891, 5194, 285, 8521, 436, 2929, 281, 320, 7607 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 253, 2644, 9493, 941, 253, 4477, 4232, 271, 5170, 3033, 273, 253, 8654, 8037, 273, 253, 6311, 3646, 285, 6760, 253, 5933, 327, 1097, 9864, 285, 44274, 74, 15302, 50275, 8826, 10414, 403, 6015, 275, 253, 20314, 10580, 323, 1650, 627, 403, 2709, 12028, 323, 253, 1072, 9380, 310, 45234, 1204, 872, 1598, 5919, 323, 28841, 391, 77, 11786, 18499, 872, 1598, 5556, 4219, 689, 19484, 1025, 11454, 6928, 11454, 33876, 3961, 953, 342, 44274, 67, 3169, 17947, 275, 253, 2905, 789, 627, 310, 1529, 3332, 2929, 326, 2175, 3646, 4715, 275, 33876, 3961, 262, 970, 253, 1072, 11454, 2990, 2605, 347, 275, 436, 2929, 1269, 86, 268, 259, 257, 1182, 1182, 31035, 288, 50276, 4297, 2805, 9169, 11454, 33876, 3961, 953, 342, 3676, 6779, 285, 20126, 17947, 549, 32693, 638, 3845, 549, 32693, 1252, 7132, 1438, 2074, 281, 253, 1750, 273, 253, 1655, 789, 436, 2929, 671, 19132, 253, 15180, 10454, 273, 11454, 33876, 3961, 953, 281, 247, 1781, 6070, 50275, 249, 1386, 577, 273, 5933, 337, 403, 368, 48484, 253, 941, 31343, 12421, 3410, 1293, 5407, 390, 816, 275, 247, 4229, 1340, 253, 14951, 273, 209, 633, 285, 1269, 74, 50276, 249, 253, 10199, 253, 4477, 4767, 326, 5231, 275, 253, 28841, 941, 403, 3907, 285, 3469, 760, 327, 253, 1655, 1375, 275, 35511, 13196, 75, 324, 1162, 355, 43425, 2299, 352, 943, 320, 671, 4518, 4767, 326, 253, 1655, 2929, 858, 417, 11322, 436, 1895, 1580, 407, 9376, 5976, 359, 871, 326, 253, 4477, 275, 436, 2929, 1335, 878, 253, 5231, 281, 320, 3907, 432, 1016, 643, 50275, 4524, 9376, 5976, 253, 4477, 7558, 326, 253, 941, 7031, 1617, 310, 760, 11295, 327, 253, 2540, 4735, 22349, 2299, 247, 1534, 3064, 275, 253, 4758, 273, 436, 2929, 432, 2571, 326, 10725, 327, 6447, 7031, 310, 326, 253, 2264, 1180, 273, 4430, 22349, 403, 4229, 347, 295, 76, 2011, 275, 9376, 7609, 1223, 643, 9380, 824, 347, 295, 26619, 290, 606, 1162, 355, 43425, 513, 417, 2430, 436, 1617, 310, 436, 253, 9560, 1921, 323, 253, 17040, 273, 253, 1617, 50276, 249, 10012, 337, 352, 310, 7558, 326, 253, 2990, 4871, 278, 310, 247, 14189, 275, 253, 1180, 273, 941, 295, 533, 352, 310, 12744, 281, 479, 752, 253, 3242, 10096, 273, 278, 327, 253, 1180, 273, 941, 295, 310, 275, 1635, 858, 368, 671, 17813, 253, 18925, 275, 253, 3368, 7533, 278, 2313, 246, 1010, 933, 50275, 6438, 253, 4477, 2380, 50268, 74, 717, 10048, 342, 253, 6128, 281, 619, 3533, 891, 5194, 326, 436, 2929, 2789, 1175, 4780, 275, 12873, 11454, 33876, 3961, 953, 285, 28841, 7533, 891, 717, 7378, 281, 2572, 619, 4868, 323, 436, 2929, 619, 17401, 310, 7194, 1754, 327, 253, 10527, 38135, 285, 253, 30437, 273, 253, 3762, 50276, 7152, 33032, 2520, 2929, 310, 253, 806, 1263, 19401, 28841, 3646, 4715, 323, 33876, 3961, 953, 342, 11454, 6928, 253, 4477, 4081, 11454, 11316, 5933, 326, 908, 11454, 2990, 281, 1566, 253, 23267, 285, 3560, 45234, 1204, 8063, 342, 2406, 7162, 3033, 275, 3646, 4715, 352, 310, 247, 1077, 27350, 5019, 253, 5933, 2987, 342, 11134, 13260, 342, 10527, 12215, 327, 697, 749, 32581, 1319, 4679, 671, 2692, 326, 11454, 11316, 41731, 13015, 643, 1666, 25379, 4757, 50275, 681, 48434, 342, 5368, 2987, 11454, 11316, 4419, 21076, 13260, 327, 941, 7031, 16774, 2014, 22872, 4719, 1617, 285, 941, 5978, 941, 476, 320, 7976, 327, 2892, 50276, 32628, 941, 285, 891, 1158, 436, 310, 247, 2201, 7680, 273, 253, 2929, 253, 21076, 9376, 2789, 253, 5933, 7763, 281, 625, 8542, 7533, 50275, 783, 33977, 1783, 327, 749, 32581, 1319, 6311, 407, 11454, 11316, 310, 2530, 253, 5933, 9093, 2987, 275, 271, 3909, 8142, 326, 10166, 327, 581, 941, 1127, 387, 1016, 19502, 3021, 2067, 10444, 1543, 476, 320, 
3587, 3732, 432, 390, 1077, 2074, 281, 3909, 14938, 41458, 275, 11454, 3961, 953, 24088, 11454, 1028, 67, 1182, 14451, 1162, 355, 6247, 604, 619, 4685, 310, 3451, 1955, 281, 253, 1027, 4736, 875, 3909, 285, 28841, 3646, 4715, 3185, 273, 5520, 7681, 458, 44661, 689, 1182, 14451, 1162, 355, 6247, 253, 2216, 273, 11454, 11316, 310, 1077, 1027, 432, 11454, 1028, 67, 24088, 2014, 941, 1127, 256, 35333, 23252, 2120, 11786, 18499, 387, 1046, 19502, 50276, 74, 10141, 253, 2022, 10012, 285, 2067, 458, 44661, 597, 1007, 3590, 285, 891, 760, 1119, 247, 1643, 16503, 923, 2708, 50276, 20881, 1255, 50275, 33542, 1582, 275, 4737, 273, 18057, 270, 22, 50274, 2808, 843, 1159, 310, 40886, 3185, 273, 17133, 50274, 9453, 41113, 253, 3064, 273, 253, 2412, 843, 3470, 253, 806, 11370, 1057, 417, 2186, 984, 2412, 843, 1159, 310, 40886, 407, 246, 9614, 7466, 359, 452, 326, 2412, 843, 89, 50276, 2808, 843, 90, 458, 82, 340, 18, 1269, 90, 71, 533, 253, 11370, 1057, 417, 2186, 672, 3192, 253, 7880, 1318, 273, 1097, 7123, 534, 310, 253, 1083, 275, 1655, 4737, 436, 1895, 943, 320, 4993, 494, 285, 18057, 270, 22, 943, 1335, 2186, 50275, 249, 4679, 253, 4477, 5393, 247, 270, 9561, 12955, 273, 11454, 11316, 285, 11454, 24204, 6368, 326, 5731, 342, 247, 1355, 14604, 273, 941, 2792, 387, 1016, 19502, 253, 906, 310, 2361, 846, 9860, 3186, 689, 270, 9561, 285, 924, 853, 581, 3213, 256, 35333, 327, 2014, 941, 1127, 891, 651, 1804, 253, 4477, 281, 1304, 253, 3045, 273, 924, 853, 285, 270, 9561, 11794, 275, 3368, 327, 253, 1524, 10186, 15302, 253, 4477, 2540, 326, 924, 853, 2684, 1805, 685, 270, 9561, 969, 1097, 1543, 943, 320, 17944, 28763, 271, 4722, 8310, 352, 651, 320, 9371, 281, 2085, 690, 22909, 689, 436, 8310, 50276, 977, 5701, 50275, 67, 11587, 253, 10527, 16397, 273, 4715, 275, 271, 3909, 5133, 403, 627, 667, 643, 4606, 281, 3657, 2120, 11786, 18499, 3733, 352, 7835, 1077, 3626, 323, 28841, 4715, 50274, 783, 4477, 5393, 846, 9376, 5976, 326, 11454, 11316, 2987, 342, 3469, 941, 253, 28841, 941, 369, 5728, 407, 271, 3939, 458, 47612, 824, 347, 247, 2805, 28269, 5570, 4390, 253, 13506, 3368, 310, 327, 3907, 941, 352, 651, 320, 4722, 281, 923, 604, 11454, 11316, 812, 789, 973, 342, 3469, 941, 24088, 897, 690, 3961, 262, 11333, 281, 4822, 253, 10895, 50273, 1189, 455, 891, 1158, 436, 310, 247, 1175, 2929, 342, 247, 15246, 2934, 273, 16248, 45234, 1204, 285, 11454, 3961, 953, 323, 28841, 3646, 4715, 891, 717, 1527, 281, 49620, 619, 13716, 604, 253, 4477, 812, 2953, 253, 7350, 5474, 339, 431, 248, 2929, 2175, 253, 1895, 273, 28841, 33876, 3961, 953, 835, 3646, 4715, 476, 760, 25057, 247, 4229, 10895, 5728, 247, 30400, 407, 3879, 7823, 970, 247, 45234, 1204, 8063, 253, 4477, 12661, 247, 747, 5933, 1925, 5723, 455, 11316, 342, 689, 19484, 1025, 11454, 6928, 285, 2085, 10527, 14938, 23632, 1754, 327, 253, 1783, 7792, 273, 253, 11454, 28196, 10295, 4679, 327, 1097, 13506, 285, 1524, 10186, 941, 403, 5196, 534, 23849, 253, 10527, 1543, 5847, 50276, 66, 253, 2929, 3400, 11859, 10527, 1783, 281, 1329, 253, 4081, 5933, 891, 452, 417, 4783, 949, 512, 253, 3538, 569, 533, 253, 4583, 906, 4453, 1175, 50276, 67, 11088, 14023, 342, 643, 2905, 2987, 403, 6283, 3559, 50276, 68, 253, 2929, 310, 3839, 973, 3542, 253, 2424, 13260, 403, 5469, 4518, 50276, 5040, 50276, 66, 7419, 432, 10527, 1783, 352, 651, 320, 1805, 604, 253, 2929, 812, 4710, 690, 1708, 327, 5933, 2216, 8180, 4457, 298, 11316, 3022, 5933, 50276, 67, 352, 310, 417, 1077, 2590, 849, 253, 7756, 273, 8084, 69, 285, 327, 310, 6786, 352, 651, 320, 1805, 281, 1918, 625, 11859, 11985, 
50276, 1189, 455, 253, 2929, 2175, 247, 747, 1895, 285, 10262, 247, 1175, 1783, 1580, 253, 5853, 310, 3240, 2074, 281, 5368, 2987, 352, 651, 320, 1199, 1805, 281, 1246, 3309, 11985, 281, 1750, 616, 9021, 5474, 33032, 5717, 253, 4477, 323, 253, 37699, 690, 273, 619, 3438, 5701, 497, 23395, 9713, 285, 7613, 891, 717, 7378, 281, 2572, 619, 13716, 281, 721, 50275, 2520, 2929, 19401, 253, 28841, 4758, 273, 253, 33876, 3961, 262, 342, 11454, 2990, 1159, 11193, 253, 2234, 2934, 273, 253, 4081, 11454, 11316, 310, 281, 897, 11454, 2990, 281, 3037, 253, 10921, 1159, 285, 897, 247, 45234, 1204, 8063, 3066, 247, 2406, 7162, 3033, 298, 11316, 323, 3061, 2403, 275, 3762, 253, 4081, 2746, 310, 2011, 281, 3037, 253, 8654, 3646, 342, 271, 2228, 3033, 258, 6165, 246, 786, 264, 805, 295, 805, 835, 465, 5596, 5593, 253, 3268, 267, 5333, 285, 246, 786, 264, 310, 271, 3576, 7877, 273, 253, 11454, 2990, 253, 16774, 12510, 273, 253, 4081, 1332, 310, 2011, 275, 247, 2491, 273, 13506, 285, 1524, 10186, 745, 22872, 4715, 3237, 50275, 856, 84, 50275, 18, 253, 45234, 1204, 8063, 5772, 1369, 74, 1162, 355, 9169, 12433, 1342, 1162, 355, 9169, 480, 249, 1162, 355, 9169, 310, 247, 3332, 2934, 5611, 275, 745, 22872, 4715, 281, 3693, 253, 2266, 9376, 273, 4209, 7031, 273, 253, 941, 253, 2234, 2934, 310, 281, 1908, 247, 37820, 2715, 26332, 2406, 7162, 3033, 298, 11316, 846, 697, 10199, 352, 556, 644, 3732, 275, 2710, 745, 22872, 391, 77, 285, 745, 22872, 3961, 262, 7533, 436, 2929, 2007, 16424, 436, 2170, 407, 16984, 436, 2934, 715, 253, 11454, 33876, 3961, 262, 4758, 50274, 19, 436, 2929, 310, 1077, 973, 3542, 285, 310, 3477, 281, 956, 50275, 5040, 50275, 18, 2905, 281, 619, 2045, 5701, 327, 253, 45234, 1204, 8063, 253, 2022, 7680, 273, 436, 2929, 310, 281, 19071, 253, 45234, 1204, 8063, 11454, 33876, 3961, 262, 4758, 352, 24772, 253, 4757, 273, 1097, 35511, 13196, 75, 324, 1162, 355, 43425, 285, 1182, 14451, 1162, 355, 9169, 835, 253, 3438, 2175, 253, 2406, 7162, 3033, 45234, 1204, 8063, 323, 3061, 11849, 275, 10334, 792, 278, 12132, 285, 33876, 3961, 262, 285, 253, 6158, 2175, 11454, 33876, 3961, 262, 342, 5170, 7162, 3033, 275, 6867, 3909, 4758, 3103, 253, 7681, 11361, 689, 841, 5368, 789, 403, 3264, 285, 10934, 50275, 19, 275, 5933, 337, 2139, 281, 12421, 3410, 3646, 17568, 432, 7856, 2059, 18, 298, 6768, 7856, 9852, 50275, 20, 275, 10012, 7609, 253, 2228, 3033, 258, 6165, 246, 786, 264, 805, 295, 805, 465, 5596, 5593, 253, 3268, 267, 5333, 285, 246, 786, 264, 310, 271, 3576, 7877, 273, 253, 11454, 2990, 50275, 18, 253, 9376, 5976, 4419, 465, 5596, 281, 320, 247, 6447, 5170, 3033, 689, 512, 3410, 1979, 295, 285, 512, 3530, 209, 633, 465, 5596, 651, 320, 247, 1159, 273, 253, 7877, 285, 253, 3410, 1979, 275, 436, 1083, 253, 2228, 3033, 1537, 11711, 463, 352, 651, 320, 625, 21414, 604, 253, 4477, 812, 2085, 690, 816, 6787, 390, 6667, 672, 465, 5596, 651, 320, 1355, 50276, 19, 12014, 253, 3576, 7877, 273, 253, 11454, 2990, 246, 786, 264, 1537, 320, 1077, 1781, 436, 3576, 7877, 273, 253, 11454, 2990, 369, 908, 323, 3909, 11454, 33876, 3961, 262, 1182, 14451, 1162, 355, 6247, 285, 11454, 278, 12132, 30966, 1162, 355, 9169, 253, 4477, 20503, 326, 246, 786, 264, 275, 841, 10414, 497, 5431, 1355, 275, 253, 1340, 273, 298, 2331, 2299, 352, 310, 7202, 1880, 50276, 85, 786, 264, 275, 253, 4081, 745, 22872, 11454, 33876, 3961, 262, 310, 1335, 1355, 275, 1340, 281, 5416, 436, 581, 651, 24473, 5467, 2515, 327, 253, 941, 5978, 1232, 281, 1453, 253, 46957, 3885, 273, 20223, 273, 253, 29975, 4315, 288, 352, 651, 320, 625, 21414, 604, 
253, 4477, 812, 2085, 7000, 21652, 273, 941, 5978, 1232, 50275, 783, 7681, 7680, 273, 436, 2929, 310, 8718, 533, 417, 2266, 253, 13260, 285, 3762, 273, 253, 2929, 878, 6832, 8254, 6787, 50276, 187, 187, 4118, 18435, 27, 2520, 2929, 2175, 745, 22872, 4715, 273, 33876, 3961, 953, 342, 11454, 2990, 26647, 253, 4081, 5933, 11454, 11316, 6993, 1754, 327, 45234, 2531, 8197, 273, 253, 23267, 2797, 949, 2406, 7162, 14493, 11454, 11316, 310, 1097, 5867, 285, 45190, 6760, 50276, 2520, 2929, 2959, 1740, 45210, 10123, 534, 5520, 1309, 253, 30080, 22559, 3408, 253, 2022, 20544, 273, 436, 2929, 403, 326, 352, 310, 973, 11407, 285, 326, 253, 906, 310, 14793, 7296, 253, 3332, 16424, 275, 45234, 1204, 323, 28841, 391, 77, 253, 14855, 310, 326, 253, 906, 310, 417, 1077, 22335, 4460, 9093, 247, 1480, 5019, 273, 45234, 1204, 342, 11454, 6928, 436, 2929, 369, 5469, 285, 512, 30628, 5821, 326, 253, 20544, 273, 436, 2929, 32180, 798, 697, 32213, 891, 5194, 285, 8521, 436, 2929, 281, 320, 7607 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper looks at the problem of combining local and global explanations to provide more effective explanations specifically they consider the use of highlights which provides policy summaries in the form of example state action pairs chosen based on specific selection criteria and the local explanation takes the form of reward component decomposition of the q values the approach was evaluated using a userstudy on a simulated driving domain where the participants were shown sample agent behavior in some conditions chosen using highlights algorithm with possible local explanations and then asked questions regarding the agent characteristics on the positive side i believe the paper looks at a very interesting problem the need to provide explanation at different levels of fidelity is an important problem and i particularly like the idea of enriching summaries with local explanation i also appreciate the fact that the authors actually ran user studies to evaluate the approach however i do have some reservations about the paper in particular regarding two main aspects choice of global and local explanation why were these particular global and local explanation techniques selected for localexplanation while there exists so many possible choices was there any particular characteristics of this local explanation that made it seem like an interesting choice as local explanation ideally you would want the local explanation to feed into the specific global explanation being used an vice versa could you have use clear distinction between the different reward components as a possible way to choose the high level states to be shown as part of the summary currently i dont see any reason to believe another local explanation wouldnt have worked as well or better in this context if in fact the authors goal is to try to find such combination of global and local explanation through user studies i would recommend doing a larger study where various global and local explanations are tried at the same time details regarding the technical approach unfortunately current writing of the paper makes it a bit hard to properly evaluate many of the technical aspects of the paper the problems here include issues from inconsistant notation usage to a lot of important details being missing just to cite an example i am assuming the symbols yt and yt are being used interchangeably more importantly in the section neural network architecture i was actually quite confused about the discussion about having different agent with different policies from the rest of the paper the idea i had gotten was that the you were selecting an action that maximizes the sum of individual reward components result in different policies i had to go back to reading the original paper van seijen et al 2017 to see that at least by my understanding they seem to maximize each q factor component separately in that context it makes sense to assume there are different agents however in your case the loss function for each head seem to take the sum of the q values of all the components which is different from what they were doing additionally there is the question of what ydoubledqnci is if it is defined only in terms of the cost component ci and chooses a next action that maximizes the q value for just that component then you have something thats between an optimal solver and what van seijen et al 2017 but then the question rises about why one 
would want to use this method van seijen et al 2017 motivates their method by citing the fact that in fact the method may help apply these methods to high dimension problems and its best to choose reward components that may use minimal number of state variables your problem is simple enough that you could have used an exact method additionally your reward decomposition is more dependent on whether the reward components correspond to meaningful terms with that said i still think the paper would lead to interesting discussions in the workshop and would recommend accepting it on a smaller note i was a bit surprised to read that you had to teach the participants about reinforcement learning are the authors in some sense claiming that these methods will only be useful to people who understand rl this would make these methods extremely narrow in its applicability and i assume it was mainly required because of the local explanation could one have used more intuitive terms for presenting the reward decomposition or does the authors believe this is a fundamental limitation of the methoddocsepcontribution and summary the main contribution of this paper is that they combined reward decomposition explanation local explanation and an existing policy summarization method called highlight global explanation they did a user study on a selfdriving car domain multilane highway environment to evaluate whether reward decomposition explanation combined with global explanation will decrease the users error in understanding the agents preferences they compared the combined explanation with baselines where there is no reward decomposition explanation added they also evaluate users confidence and satisfaction through a subjective questionnaire to see how different explanations their combined explanation vs baselines would affect these factors the takeaways of this work are that combining reward decomposition explanation with global explanation will increase the correctness rate of the users but it will not have any effects on users confidence and satisfaction major comments in my opinion while a good presentation and the right evaluation of the idea could make the work an interesting work this paper in its current version has major issues in its presentation detail and evaluations the main motivation of this paper is not clear as to why such explanations are needed and what would this work and its result add to the community the evaluations also are incomplete and the result cannot be concluded in this current state my comments are as follows 1 the major motivation of the paper is not clear why such integration and study is needed 2 in the related work more detail needs to be provided about the cited papers for instance a sequence of references is listed for the local explanation but some detail about each of these works should be provided what are their methods and takeaways this is the case for the cited papers on explainability in selfdriving cars more importantly after that you need to provide the motivation and differences of these papers and how your paper stands out from the related works 3 on page 2 it is written that the work uses the double dqn but without explaining why and the reason for using double dqn instead of alternatives more details need to be provided 4 figure 1 is very premature where it is just showing a neural network with three heads what exactly did you want to present with this image i think anybody knows what a three heads nn looks like instead its good to have a more informative figure 
that showed the architecture used in the paper 5 in the section integrating highlight with reward decomposition its not clear what is the change you made in the neural network and what are the differences its useful to have a figure to show both architectures and more detail is needed to explain both clearly and explain the reason for having this multihead nn furthermore the integration part is also not clear how do you integrate highlight and reward decomposition and what does it mean by their integration 6 its not clear why the multihead is needed at all and what its use and benefit of it is it for integration or something else major detail is needed in the integration section 7 the paper wasnt organized well especially in empirical evaluation for example it first talked about the 4 conditions of the study and the table then gave the detail while the detail should come before that this is the case when talking about the image and video 8 in fsrd it should be clarified how the states are shown to the users i could later guess based on the figure but more detail needs to be provided 9 on hrd page 4 it is written meaning that they did not get the context to that state as the highlight algorithm provides what is the context that is missing here 10 in fs it is not clear how many states are selected 11 fs is another way of policy summarization so its better to have an additional baseline without any explanation to have a useful comparison 12 in the evaluation the users get to assess the preferences in a pairwise reward component while the paired comparison is useful in my opinion all 4 also should have been rated besides each other to get the right evaluation 13 in table 2 how and why these rewards are chosen what is the base for these choices and how the change in the rewards will affect the results 14 the paper evaluates the correctness rate of the users in knowing the agents preferences in my opinion this is a factor influenced by many things such as the users attention and level of understanding of the explanation so to evaluate this you should measure their attention and their understanding as well to see how much the result is dependent on the provided explanation as other dependent variables moreover it is important to make this clear that the users correctness rate is different than their understanding of the explanation something that is not evaluated here 15 the paper excludes participants who didnt answer attention questions and didnt complete the survey in less than 7 min what is the attention question what is the basis for choosing the 7 min criterion for excluding participants 16 the results and evaluation are incomplete and inconclusive you cant say one condition is significantly better than the other without actually evaluating them with a proper statistical test and without calculating the pvalue the paper needs to provide clear hypotheses and then show them by statistical tests 17 it was written that in addition our results show that the combination of hrd helped assess the agents preferences when the difference between the reward types was minor why does it matter i think just having a higher correctness rate for hrd is not enough to conclude that not only should a statistical test be done to evaluate this but also in my opinion many different factors might influence the condition which makes it hard to conclude this even with a statistical test 18 similar to previous points the claim that hrd is significantly better and there were cases in which this combination 
significantly helped is questionable without actual evaluation minor comments 1 some of the equations are written in a bad format such as missing space and combining text and math which is confusing for example on page 2 t rs a s 0 1sts s s a a there should be space between different elements here rn to rm should be rn rm also some notations are very confusing to me doesnt look right and i am not sure if this is the standard way of writing them examples a vectorvalued reward function r s x a rc and vectorvalued qfunction q its not clear what is the meaning of the first arrow beside the text 2 on page 3 its not clear if the lane left idle lane right faster slower are the actions or something else 3 change lane speeding up and moving to the right lane were introduced as components of the reward function and then later on page 4 they are mentioned as actions please clarify this typos there are many typos in the paper here are some of them 1 page 1 understanding graphs might be more difficult then video summarization than 2 page 2 in order in order for the users 3 page 2 li esars y dqn i qs a i2 missing after equation 4 page 3 si si sil1 ail1 second si should be ai 5 page 3 its loss function its loss function 6 page 6 the agents preferences agents preferences 7 page 7 participants were asked to compere between different agents compare 8 page 7 we asked participants to comer between different actions compare suggestions and questions to the authors my suggestions and questions to the authors are what i mentioned in the comments ### Summary:
The paper looks at local and global explanation methods for explaining the behavior of RL agents. The problem being addressed is very interesting and can lead to fruitful discussions in the workshop. Nonetheless, as pointed out by the reviewers, the paper has many flaws with respect to its presentation, technical description, and evaluation of the proposed approaches. One of the main concerns raised was the paper's underlying motivation, i.e., the choice of the specific local and global explanation techniques has not been made clear or justified by the authors. The reviewers have also pointed out issues with the paper's clarity, technical details that are not described and justified adequately, notational issues, and so on. I would recommend that the authors not only address these comments in a future iteration of the paper but also engage and respond to the reviewers on OpenReview.
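The review above centers on combining HIGHLIGHTS-style policy summaries with reward-decomposition explanations via a multi-head Q-network, and notes that the architecture and the integration were not described clearly. Purely as an illustration (not the paper's actual implementation), the sketch below shows one common way such a decomposed Q-network is set up: one head per reward component, with the summed Q-values used for action selection and the per-component values exposed as the local explanation. The component names, layer widths, and dimensions are assumptions.

```python
import torch
import torch.nn as nn

class DecomposedQNetwork(nn.Module):
    """Q-network with one head per reward component (hypothetical sketch).

    Each head predicts Q_c(s, a) for one reward component c; the total
    Q-value used for action selection is the sum over components, which is
    what makes per-component values usable as a local explanation.
    """

    def __init__(self, state_dim, n_actions,
                 components=("change_lane", "speed", "right_lane")):
        super().__init__()
        self.components = components
        self.trunk = nn.Sequential(
            nn.Linear(state_dim, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
        )
        # one linear head per reward component
        self.heads = nn.ModuleDict({c: nn.Linear(128, n_actions) for c in components})

    def forward(self, state):
        h = self.trunk(state)
        q_per_component = {c: head(h) for c, head in self.heads.items()}
        q_total = torch.stack(list(q_per_component.values()), dim=0).sum(dim=0)
        return q_total, q_per_component


# usage sketch: greedy action from the summed Q, per-component values as the explanation
net = DecomposedQNetwork(state_dim=8, n_actions=5)
state = torch.randn(1, 8)
q_total, q_parts = net(state)
action = q_total.argmax(dim=1)
explanation = {c: q[0, action].item() for c, q in q_parts.items()}
```

Under this design, the reviewer's question about the per-head loss amounts to whether each head is regressed onto its own component's return (as in van Seijen et al., 2017) or onto a shared target formed from the summed Q-values, which is the ambiguity the review asks the authors to resolve.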
[ 4307, 347, 973, 390, 1805, 275, 436, 3634, 604, 275, 958, 253, 4477, 4736, 310, 281, 1611, 281, 1089, 824, 5019, 273, 4156, 285, 1980, 8813, 949, 2608, 2175, 891, 651, 5583, 2509, 247, 4067, 1263, 835, 2710, 4156, 285, 1980, 22909, 403, 3597, 387, 253, 1072, 673, 50276, 23454, 5001, 253, 7681, 2746, 19235, 1655, 4028, 273, 253, 2929, 2789, 352, 247, 2372, 1892, 281, 6283, 7472, 1142, 273, 253, 7681, 7794, 273, 253, 2929, 253, 3237, 1060, 2486, 3374, 432, 12592, 5567, 14951, 10393, 281, 247, 2257, 273, 1774, 4278, 1146, 5816, 816, 281, 26542, 271, 1650, 891, 717, 7384, 253, 14217, 340, 85, 285, 340, 85, 403, 1146, 908, 28961, 1598, 50276, 3062, 15538, 275, 253, 2593, 11454, 2990, 10336, 891, 369, 2686, 3240, 13477, 670, 253, 5955, 670, 1907, 1027, 5570, 342, 1027, 7823, 432, 253, 1551, 273, 253, 2929, 253, 2934, 891, 574, 12759, 369, 326, 253, 368, 497, 17221, 271, 2250, 326, 11903, 4219, 253, 2020, 273, 2060, 10921, 4295, 906, 275, 1027, 7823, 891, 574, 281, 564, 896, 281, 4361, 253, 3236, 2929, 3889, 396, 1944, 257, 1162, 355, 4240, 281, 923, 326, 387, 1878, 407, 619, 4685, 597, 1646, 281, 22950, 1016, 2805, 2803, 4445, 11794, 275, 326, 3634, 352, 2789, 3282, 281, 5467, 627, 403, 1027, 6083, 2299, 275, 634, 1083, 253, 2957, 1159, 323, 1016, 1481, 1646, 281, 1379, 253, 2020, 273, 253, 2805, 2193, 273, 512, 253, 4295, 534, 310, 1027, 432, 752, 597, 497, 2509, 23000, 627, 310, 253, 1953, 273, 752, 340, 45922, 11046, 47051, 5297, 310, 604, 352, 310, 2931, 760, 275, 2426, 273, 253, 2105, 4445, 16399, 285, 28467, 247, 1735, 2250, 326, 11903, 4219, 253, 2805, 1318, 323, 816, 326, 4445, 840, 368, 452, 1633, 28763, 875, 271, 8654, 47037, 285, 752, 3889, 396, 1944, 257, 1162, 355, 4240, 533, 840, 253, 1953, 22844, 670, 2139, 581, 651, 971, 281, 897, 436, 1332, 3889, 396, 1944, 257, 1162, 355, 4240, 15265, 684, 616, 1332, 407, 19936, 253, 958, 326, 275, 958, 253, 1332, 778, 1361, 4647, 841, 3082, 281, 1029, 7877, 3237, 285, 697, 1682, 281, 5206, 10921, 4295, 326, 778, 897, 8723, 1180, 273, 1375, 4903, 634, 1895, 310, 2969, 2217, 326, 368, 812, 452, 908, 271, 3242, 1332, 23000, 634, 10921, 14717, 310, 625, 7976, 327, 1880, 253, 10921, 4295, 2723, 281, 14282, 2426, 50276, 3113, 326, 753, 891, 1335, 1158, 253, 2929, 651, 1421, 281, 4722, 11985, 275, 253, 22586, 285, 651, 5583, 18738, 352, 50276, 251, 247, 4577, 3877, 891, 369, 247, 2372, 9861, 281, 1239, 326, 368, 574, 281, 9798, 253, 5014, 670, 35221, 4715, 403, 253, 4477, 275, 690, 3282, 15081, 326, 841, 3082, 588, 760, 320, 4217, 281, 952, 665, 2096, 391, 77, 436, 651, 1056, 841, 3082, 6685, 6891, 275, 697, 30437, 285, 891, 5467, 352, 369, 7194, 2424, 984, 273, 253, 1980, 8813, 812, 581, 452, 908, 625, 27350, 2426, 323, 15250, 253, 10921, 14717, 390, 1057, 253, 4477, 2868, 436, 310, 247, 7936, 12291, 273, 253, 1332, 7152, 33032, 1987, 2382, 285, 6010, 253, 2022, 7680, 273, 436, 2929, 310, 326, 597, 5678, 10921, 14717, 8813, 1980, 8813, 285, 271, 5368, 3646, 10405, 1320, 1332, 1925, 6780, 4156, 8813, 597, 858, 247, 2608, 1263, 327, 247, 1881, 41571, 1113, 5028, 33362, 1351, 17657, 3126, 281, 7472, 1880, 10921, 14717, 8813, 5678, 342, 4156, 8813, 588, 6379, 253, 4212, 2228, 275, 4685, 253, 6083, 17971, 597, 2429, 253, 5678, 8813, 342, 1666, 25379, 835, 627, 310, 642, 10921, 14717, 8813, 2879, 597, 671, 7472, 4212, 7162, 285, 13212, 949, 247, 17854, 15126, 281, 923, 849, 1027, 22909, 616, 5678, 8813, 4632, 1666, 25379, 651, 2818, 841, 2616, 50275, 783, 1379, 42287, 273, 436, 789, 403, 326, 16248, 10921, 14717, 8813, 342, 4156, 8813, 588, 2572, 253, 
…, 15337, 209 ]
[ 1, 1, 1, …, 1, 1 ]
[ 4307, 347, 973, …, 15337, 209 ]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper proposes to speed up inference of deep equilibrium models deqs by replacing the classic fixedpoint solvers broyden or anderson acceleration by a learned extension of aa their approach operates on a pretrained deq and trains a small neural network to propose an initialization and update scheme based on ground truth fixed points their method is orthogonal to existing regularization approaches to speeding up deqs the paper has extensive experiments across large scale tasks language modeling imagenet classification and semantic segmentation they show pareto improvements across these tasks as compared to standard deqs while only adding a 1 additional training overhead of the deq the introduction motivation and overview of deqs and their contribution is very strong they provide extensive experimentation proving the benefits of their method across many largescale tasks pushing the state of the art of deqs closer to practical deployment some questions does the learned extension of aa still guarantee fixedpoint convergence is it enough to enforce that the alphak weights sum to 1 thanks to this paper deq models seem to be approaching practicality during inference time how does their training time compare to explicit models i understand this is orthogonal to this paper im just curious one caveat you mention is that bptt to train hyperanderson network could use a lot of memory i think this could be substantially reduced with gradient checkpointing if it ever becomes a bottleneck i believe this is a very strong paper that significantly pushes the state of the art of deqs and should get accepted without reservation barring something substantial i may have missed docsepthe authors introduce a neural network approach for solving the fixed point equations arising in deep equilibrium models this consists of a tiny network that provides an initial guess for the fixed point as well as a small network that computes coefficients inside an algorithm inspired by anderson iteration the approach is intuitive and empirical although no theory is given the authors demonstrate the strength of their proposed solver in large scale experimental evaluations specifically the new solver is fast to train has a small parameter count and appears to drastically shift the pareto front of the inference speedperformance curve for all deq models strengths the technique is intuitively motivated as a learnable version of anderson acceleration which while lacking theoretical basis is easy to follow and a natural method to try the experimental evaluation is convincing the extensive empirical evaluation of the proposed approach gives the reader the sense that the hypersolver is at least worth trying on their deq model weaknesses 1 being a largely empiricallyintuitively motivated work there are many user settings and hyperparameters and even hardcoded arbitrary choices eg the 3 losses introduced in section 42 that have to be chosen within any theoretical guidance and are not wellmotivated still this work opens the door for future mathemtical analysis of the proposed method although perhaps on a more restricted or smaller practicalempirical scale 2 in lconv is zast necessarily a true solution or just one provided by another solver none of the losses in section 42 actually minimise the ostensible goal of the equilibrium solver why is there no loss that directly minimises the absolute value of the residual g it seems 
like the neural solver is learning to imitate the behaviour of a provided solver given access to zast which may not actually be a root nor does a root necessarily exist nor is it guaranteed to be unique rather than solve the problem directly did you try to minimise g instead any reason or intuition as to why you chose this method commentssuggestionsquestions 3 one of the really nice features of your technique is that as you mention the fixed point solver does not need training labels in order to update its parameters this ability to do unsupervised training could be highlighted in the abstract 4 typo we treat the it as a mini timeseries of length 5 typo 2 these hyprtsolvers can be trained very quickly 6 figure 4d all of the interesting part of the graph occupies 996 to 100 remove the bottom 996 of the vertical axis maybe even a logarithmic scale is appropriate 7 it may be present in the appendix but i was not able to easily find it how many random seeds do the curves in figure 4 represent hopefully not one is each point independent from each other does each point use different random seeds both within the same colour and outside of the same colour 8 can you try your method on some smaller networks eg around 1000100000 parameters and simpler tasks to see if an advantage over anderson acceleration still persists i am interested in understanding whether the success of this method is tied to the difficulty of the fixed point problem which should intuitively be greater in larger models an empirically motivated neural network replacing the role of a traditional root finder eg anderson acceleration in deq models appears to improve performance and inference time however this root finding network introduces a new set of hyperparameters and all the baggage usually associated with deep learning no theory of convergence optimisation generalisation hyperparameter choice weakness 1 and it is not clear whether the choices made by the authors are universally applicable weakness 12 all in all a good paper that is likely to be adopted by the community and spark future research docsepthe paper presents a method called neural deep equilibrium solver to increase the efficiency in the inference stage for implicit deep models by initializing the equilibrium states using neural network the authors start with the traditional anderson acceleration scheme for fixed point calculation and extend it using neural network initialization and anderson steps to improve the inference efficiency the authors conduct comprehensive experiments to demonstrate that the speed up in inference is significant and general with little overhead at training time the further experiments shows that the proposed method can be incorporated in the training procedure to give faster training while introducing the speedup at inference time strength the paper has the following strength the paper researches the very interesting important and difficult problem of accelerating implicit model inference the paper gives a working solution using neural networks to assist the inference fixed point calculation with impressive results the method offers significant speedup 2x in forward steps over the current solvers for fixed point calcuation at inference bringing the inference speed of the model close to explicit forward feeding models thorough empirical analysis on the property of the solver is well conducted in the paper showing the overhead for training the solver is minimal compared with the training of the implicit models it is also demonstrated that 
the solver generalizes and scales well to large experiments in natural language processing and computer vision further experiments on creative usage of the solver at training time show that the merits of the speedup when incorporated in training are larger than the overhead on training the solver this shows that the application of the method is free weakness the paper is well written but has a few glitches i think the paper would be a good addition to iclr if the authors can carefully address them the notations of the paper have some typos is gk in 2 a matrix if not the l2 norm can be defined clearly is the dimension of hatgk of dimension c by mk 1 instead of mk 1 by c the use of hyper anderson iterations can be more clearly justified after reading the sec 41 i am still curious how much worse it would be to do the naive thing by solving problem 2 maybe approximately explicit explanation with some numbers could help the paper does not offer significant theoretical contribution but this is not the goal of the paper and the reviewer believes that it should not undermine its empirical contributions the merits of the proposed solver should be highly general and go beyond deep equilibrium models dems to other implicit models including implicit architectures for graphstructured data gu 2020 and implicit feature pyramid network for object detection wang 2020 discussions and potentially experiments about how the method would work on them can only improve the merits of the work ref gu f chang h zhu w sojoudi s el ghaoui l 2020 implicit graph neural networks advances in neural information processing systems 33 1198411995 httpspapersnipsccpaper2020hash8b5c8441a8ff8e151b191c53c1842a38abstracthtml wang t zhang x sun j 2020 implicit feature pyramid network for object detection arxiv preprint arxiv201213563 httpsarxivorgabs201213563 this paper presents a method to significantly improve the inference of implicit models although the paper does not focus on theoretical contribution it demonstrates empirical merits very well and the results are impressive the paper is written clearly and is a solid work overall ### Summary:
the authors introduce a neural network approach for solving the fixed point equations arising in deep equilibrium models this consists of a tiny network that provides an initial guess for the fixed point as well as a small network that computes coefficients inside an algorithm inspired by anderson iteration overall there is consensus among the reviewers that the paper is well written and is a strong empirical study i recommend acceptance as a poster additional remarks the authors argue the deqs implicit deep learning models allow a decoupling between representational capacity and inference time efficiency yet in the regularizing implicit models paragraph they write implicit models are known to be slow during training and inference to address this recent works have developed certain regularization methods that encourage these models to be more stable and thus easier to solve which seems like a contradiction to me so while in theory i agree with this decoupling in practice it seems not completely true section 3 should include some discussion on conditions on ftheta for the existence of a fixed point since the initialization and hyperanderson networks are trained using unrolling there is some memory overhead compared to vanilla deqs that are differentiated purely using implicit differentiation it would be great to clarify the amount of extra memory needed by these networks it is necessary to justify that the initialization and hyperanderson networks are smaller than usual neural networks
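The reviews and the summary above describe the solver only in prose: a tiny network proposes an initial guess for the equilibrium, and a small network produces the coefficients used inside an Anderson-style update. For reference, the sketch below shows plain, non-learned Anderson acceleration on a toy fixed-point problem z = tanh(Wz + x). The toy layer, the function names, and the regularised least-squares step are illustrative assumptions, not the paper's actual implementation; per the summary, the reviewed method would instead predict the initial guess and the mixing coefficients with small trained networks.

```python
import numpy as np

def f(z, W, x):
    # toy DEQ-style layer: the equilibrium z* satisfies z* = tanh(W @ z* + x)
    # (illustrative stand-in for a pretrained DEQ cell, not the paper's model)
    return np.tanh(W @ z + x)

def anderson(g, z0, m=5, iters=50, lam=1e-4, tol=1e-8):
    # simplified Anderson acceleration for the fixed point z = g(z)
    zs, fs = [z0], [g(z0)]
    z = fs[-1]
    for _ in range(iters):
        zs.append(z)
        fs.append(g(z))
        n = min(m, len(zs))
        # residual history: columns are g(z_i) - z_i for the n most recent iterates
        G = np.stack([fs[-i] - zs[-i] for i in range(1, n + 1)], axis=1)
        # mixing weights: minimise ||G a|| subject to sum(a) = 1, via a small
        # regularised linear system (the learned solver would predict these instead)
        A = G.T @ G + lam * np.eye(n)
        a = np.linalg.solve(A, np.ones(n))
        a /= a.sum()
        z = sum(ai * fs[-i] for ai, i in zip(a, range(1, n + 1)))
        if np.linalg.norm(fs[-1] - zs[-1]) < tol:
            break
    return z

rng = np.random.default_rng(0)
d = 64
W = rng.normal(size=(d, d)) / (2 * d)  # keep the map contractive so a fixed point exists
x = rng.normal(size=d)
z_star = anderson(lambda z: f(z, W, x), np.zeros(d))
print(np.linalg.norm(f(z_star, W, x) - z_star))  # residual norm at the returned estimate
```

The design point the reviews debate is visible here: the least-squares mixing step is cheap but fixed, whereas replacing the initial guess z0 and the weights a with outputs of small trained networks (as the summary describes) lets the solver adapt to the particular model at a small training overhead.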
[ 30003, 310, 1677, …, 11454, 6928 ]
[ 1, 1, 1, …, 1, 1 ]
[ 30003, 310, 1677, …, 11454, 6928 ]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: this paper proposes a federated learning algorithm called fedprof by using a training rule that updates the model based on divergences between data representation strength i personally like the style of this work which attempts to formulate and present results on fl in a more theoretical way weakness overall the presentation on theoretical results can be improved and the connection of some statements to the main results is not clear 1 for example in proposition 1 and 2 all nonlinear components in the neural network seem to be bypassed and the statements are simply equivalent ways of stating clt if it is the case this needs to be clearly mentioned before those propositions besides it is also not clear to me how this oversimplified model is related to the rest of the works 2 the main theorem needs to be better stated and explained for example the most relevant question to the readers is how the construction of profiles affects the performance of the algorithm 3 to better understand the main theorem it would help if the stated learning bound can be compared to some benchmarks eg the performances of using fixed lambda 4 minor the last equations in all proposition proofs are stated in terms of convergence in distribution which may not hold overall the results in this paper can be better presented and the connections between certain statements are not clear therefore i recommend a rejection docsepthe authors tackle the federated learning scenarios where local data is biased noisy or even irrelevant which could significantly degenerate the global performance and the authors propose fedprof which utilizes data representation profiling and matching to mitigate the impact of lowquality clients during training this paper is wellwritten and easy to understand the motivation is clear and intuitive the proposed method is technically sound and the results are strong i have a few comments particularly for the evaluation part more datasets it seems to work well on the reported datasets i wonder if the tendency is still consistent with more benchmark datasets i would like to suggest common datasets such as cifar10 and cifar100 i also would like to see the results when the number of classes is increased standard deviation can you provide statistical information in table 2 and 3 i would like to see the variance of the proposed model few performances seem marginal compared to the base models the current baseline models i found that the baseline models used in the paper are slightly outdated i suggest 1 the stateoftheart method tackling similar scenarios via performing modellevel contrastive learning overview please consider adding the concept illustration for fedprof which can be helpful for common readers to capture highlevel ideas 1 modelcontrastive federated learning cvpr21 overall i enjoyed reading the paper and ill raise my score if my concerns are properly addressed docsepthis paper proposes fedprof a new client sampling scheme that speeds up the convergence of fedavg type algorithms the author proves convergence of fedprof under a set of simplifying conditions and they demonstrate the utility of fedprof via empirical studies strengths the proposed method is intuitive and simple to implement it also comes with theoretical guarantees though the assumptions are highly idealized weaknesses and some questions if i understand correctly pls correct me if im wrong proposition 1 and 2 are
consequences of lyapunov clt thus one needs each coordinate of the representation vector to be independent it is not clear to me if the independence assumption would hold in practice following my previous point i am curious about what will happen if we replace the product gaussian eq 2 by a gaussian with some nonidentity covariance matrix is the choice of kl divergence eq 4 essential for the performance what will happen if we choose other metrics how sensitive is the performance with respect to the choice of alpha i appreciate the simplicity of the idea and the performance boost it brings about from what i understand it can be applied to any fl algorithms that involve client sampling therefore i think this paper deserves 6 marginally above the acceptance threshold docsepin this paper the authors propose a user selection algorithm for federated learning fl the key motivation is to select high quality clients for update and thus to reduce the impact of low quality data on fl training a hidden hypothesis is that high quality data has similar representations while noisy and lowquality data has different distributions of representations based on this hypothesis the key idea is to select users based on their representation layer distribution difference from the global model the higher the difference the lower the chance of the client being selected furthermore the authors observed and proved that representation layers follow a gaussian distribution which makes the representation difference learning more efficient the authors evaluated the proposed algorithm in comparison to existing algorithms using a smallscale sensor dataset and a largescale emnist dataset the proposed algorithm performs well in the evaluation the key idea of the paper is to select highquality data under the hypothesis noisy and lowquality data has different distributions of representations this hypothesis needs further justification first fl is known for dealing with noniid heterogeneous datasets consider a scenario where clients have highquality but heterogeneous datasets how general does this hypothesis hold in some sense the idea here is opposite to another idea that has been used for client selection where clients with high loss are more likely to be selected also i wonder if the algorithm could get stuck in a local suboptimal condition specifically lets say there are two clusters of clients with different focuses by chance initial training selected users from one cluster then the global model will further select users from the same cluster could this happen the current evaluation is limited to two simple and homogeneous tasks with added noise which provides an ideal situation where the proposed algorithm would shine while these evaluations illustrate the effectiveness of the proposed idea under ideal conditions more thorough evaluations under general heterogeneous conditions are needed to better understand the pros and cons of the proposed algorithm also most of the comparison algorithms were proposed in 2019 except one that focuses on communications given the fast development of fl i wonder if there are more recent algorithms that should be compared with the authors made an observation that representation layers follow a gaussian distribution i understand that this makes it more efficient to calculate distributional difference however in principle this can be applied to cases where gaussian distribution does not hold right can you elaborate more on this both in terms of using gaussian distribution to approximate a 
nongaussian one and using the true nongaussian distribution in the proofs of prop 1 and prop 2 central limit theorem is applied and thus requires the lyapunovs condition def 1 could you comment on how restrictive this condition is for example in your experiments does this assumption hold one clarification question in the experiments how did you select the representation layer in summary the authors propose to more favorably select fl clients whose representation distribution is more similar to that of the global model while the current evaluation results are encouraging data heterogeneity in fl and model robustness need to be carefully considered ### Summary:
this paper proposes a federated learning method called fedprof that adaptively selects different subsets of the clients data in training the global model there were several concerns brought up in the reviews and discussion the multivariate gaussian with identity covariance assumption on the neural network representation is limited the paper also claimed to provide privacy preservation but there is no formal statement of the actual privacy guarantees the fact that its running federated learning does not guarantee privacy protection the presentation could use improvement the reviewers had issues trying to understand the main theorem overall there is not sufficient support for acceptance
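The client-selection rule debated across these reviews (score each client by the divergence between its local representation profile and the global model's profile, then sample clients with probability decreasing in that divergence) can be made concrete with a short sketch. The snippet below is an illustrative reconstruction of the mechanism as the reviews describe it: diagonal-Gaussian profiles compared with a KL divergence and an exp(-alpha * divergence) style weighting. The function names, the diagonal-covariance simplification, and the exact weighting form are assumptions made for the example, not FedProf's actual implementation.

```python
import numpy as np

def gaussian_profile(representations):
    """Fit a diagonal-Gaussian profile (mean, variance per dimension) to a
    batch of representation-layer activations of shape (n_samples, dim)."""
    mu = representations.mean(axis=0)
    var = representations.var(axis=0) + 1e-8  # guard against zero variance
    return mu, var

def kl_diag_gaussians(mu_p, var_p, mu_q, var_q):
    """KL( N(mu_p, diag(var_p)) || N(mu_q, diag(var_q)) )."""
    return 0.5 * np.sum(
        np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0
    )

def selection_weights(client_profiles, global_profile, alpha=1.0):
    """Clients whose profile diverges more from the global profile receive a
    lower sampling weight (one plausible reading of the reviewed scheme)."""
    mu_g, var_g = global_profile
    divs = np.array([kl_diag_gaussians(mu_c, var_c, mu_g, var_g)
                     for mu_c, var_c in client_profiles])
    w = np.exp(-alpha * divs)
    return w / w.sum()

# usage sketch:
#   probs  = selection_weights(profiles, global_profile, alpha=0.5)
#   chosen = np.random.choice(len(profiles), size=k, replace=False, p=probs)
```

The sensitivity to alpha that one reviewer asks about is visible directly here: a large alpha concentrates sampling on the clients closest to the global profile, while alpha near zero approaches uniform sampling.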
[ input_ids, attention_mask, and labels arrays for the fedprof example above omitted for readability ]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the author goes over the mapping challenges proposed by the w3c kgc working group and for each discusses support in shexml the mapping challenges are categorized based on whether they were already supported in shexml whether support was added as a paper contribution or whether they remain unsupported by shexml for each challenge it is also discussed how other mapping languages might deal with this type of challenge for the still unsupported challenges this discussion leads to perspectives on how to add support for these challenges in the future the paper presents a nice set of additions to the shexml language making the language more mature and complete the quality of the presented solutions is high overall some minor comments are given below the solutions are framed well with relation to existing languages which offer similar solutions it would have been interesting to give a more structured comparison of how different languages tackle the challenges but that was not the goal of the paper as the title already indicates i have the following comments on the content which might need to be addressed to improve the works quality in section 31 a datatype inferencing mechanism is presented the mechanism is based on trying to parse strings as different datatypes and returning the first successful datatype from a list while not bad per se and obviously useful in many cases this mechanism has one drawback it first turns the data value into a string many data models have their own internal data types eg json has strings booleans numbers by first turning all values that have such a data type into strings that information is lost in contrast r2rml has a section on converting sql data types into corresponding xsd types httpswwww3orgtrr2rmldatatypeconversions a good solution would be to extend these conversions to other data models than the sql2008relational model on page 5 it is mentioned that to generate the cartesian product it would be as easy as to define different top level iterators this idea is clear however no example is included or attached which makes it hard to understand how it would look in shexml adding an example maybe on the external webpage with the other examples would make things more clear i tried to make my own example but could not immediately get it working so i do not think it is trivial in section 41 it is not entirely clear why pushed and popped fields are needed consider this slightly modified example iterator records jsonpath records field id id iterator cars cars field make make excar expersoninputrecordscarsmake exowner expersoninputrecordsid it seems like the engine can effortlessly determine from the fields names that inputrecordsid comes from one level higher than inputrecordscarsmake so the first needs to be pushed down letting the engine determine this would relieve users from declaring the pushes and pops themselves for context in xr2rml which introduced pushedpopped values determining this is not possible since there references have no name which shows which level they are on is there an implicit assumption all fields in a shape should come from the same hierarchical level does letting the user declare pushingpopping give them more finegrained control these things should i think be clarified in section 45 a syntax is presented for generating rdf collectionscontainers the design of this syntax seems counterintuitive rdf containers are around rdf terms but
in this language their definition is inside a term eg expersonarticleauthorname as rdflist this makes it seem like experson is not part of the collection while it is perhaps clarify why this design choice was made over the more intuitive expersonarticleauthorname as rdflist or something similar further these comments on details of style would improve the paper if addressed since the paper relies partly on externally hosted content using clickable hyperlinks would make it more readable when describing the contents of sections after the introduction a description of section 2 is missing table 1 is hard to interpret it contains checkmarks and exceptions subchallenges between parentheses but sometimes these exceptions are positive and sometimes negative since the paper is well under the page limit perhaps consider adding a row in the table for each subchallenge in the first demo under generate multiple values the user interface gives a warning on parsing field lastname firstname1label probably numbers need to be added to the allowed characters a similar warning is given on input 4 of datatype map for i believe double quotes inside matcher expressions there are some language mistakes in the text eg p1 complicate should be complicatedcomplex docsepthe paper describes how the language shexml can address various challenges of data transformation that have been identified in the knowledge graph construction community group it reports on what was possible in the early versions of the language what changes were made to address more challenges and whats still remaining to be solved this paper could be useful to people who are using shexml when faced with similar challenges as those identified by the community group or people who would like to implement a solution for mapping data to rdf such that the solution addresses the challenges however as a research paper it does not bring much first i would argues that the research questions provided at the beginning are more engineering problems than true research they are similar to how to implement x in language y for instance how to implement a web server in pure sql this is indeed a challenge but hardly a research problem the research question should rather be something like how to address the challenge x with requirement y and possibly requirement y directs the solution towards using a certain language due to certain characteristics of the language second even if this reports on shexml its evolution and implementation it would have been good to provide comparison with other languages can rml address the same challenges to what extent how complicated are equivalent mappings in rml what about sparqlgenerate which is somewhat similar in its syntax to shexml more specific comments in table 1 it is not clear what the last column means in listing 11 why not include a conversion to dates and datetimes in listing 12 the join on firstname seams strange when the goal is to get family names smaller issues intro how can not addressed challenges how can unaddressed challenges sec2 how not solved challenges how unsolved challenges sec3 those which are reachable with those that are reachable with sec33 this case comes more complicated this case becomes more complicated sec41 need to obtain values which are parents need to obtain values that are parents sec44 mak it as simpler as possible make it as simple as possible sec5 to rethought to rethink sec52 to convert them to csv a then treat to convert them to csv then treat sec6 continuist is this a real word i could not 
find it in several dictionaries ref 6 lefrancois lefranois ref 8 fno dbpedia capital letters needed ref 10 and 11 xr2rml xr2rmldocsepthis paper presents an overview of how shexml solves or could solve the mapping challenges posed by the knowledge graph construction community group the structure of the paper is wellthought out starting from those challenges that were already covered by shexml before the challenges were posed moving to extensions introduced to cover several of the posed challenges and ending with a discussion on those challenges still open in relation to shexml a strong point of this paper is that it makes its claims easily verifiable by linking to a supplementary webpage containing an overview of the challenges and runnable demonstrations that show that the engine can indeed given the extended constructs solve the mapping challenges as described in the paper this is superb regrettably i find the paper is written in such a way that you can only understand parts of it if you are familiar with shexml since the author often provides little textual explanation of how the relevant language constructs work instead the author refers to shexml snippets with examples of the applied constructs leaving the reader to figure it out on their own reading the paper it is often unclear whether the author is referencing shexml the language or the implementation of the engine or both the paper also suffers from a great many grammatical errors throughout not aiding readability of particular note is the amount of missing articles like the or a it is not clear to me into which of the paper categories this submission falls to me it seems to match the systemdemo papers category most the specified amount of pages for a systemdemo paper is 46 which this paper doubles i believe the paper could be shortened by replacing section 3 with a reference to the supplementary solutions webpage since with the limited explanation of the constructs i feel it adds little value to the paper section 5 could also be shortened since most of the future actions are described in a speculative manner overall im a bit torn although i believe the extensions to shexml which solve the mapping challenges would be a very interesting contribution to the workshop i feel this paper still needs significant work done to describe devised or proposed solutions more clearly in the text shorten the paper to within the bounds of the page limit set for systemdemo papers unless this is another type of paper which should then be made more clear fix the grammar because of this i lean slightly towards reject below ive collected a few additional remarks per section 31 datatype map input 5 in the case of input 5 it is intended that mapping languages would be able to generate datatype tags according to the most probable value according to values formats for example in input 5 it is expected that number 3 would have an xsdinteger datatype and 314 would have an xsddecimal one looking at the input the value for num in the second object is 314 now i dont know if this was intentional but in my opinion it highlights a significant problem with the presented approach in shexml if such automatic datatype inferences were to be supported they should in my opinion only infer based on available information in the source document structure that is to say if the source document type has certain native datatypes one could use that information to infer a datatype during mapping in the case of the value 314 in a json document i would definitely expect 314 to have 
datatype xsdstring because the json datatype is also string however in shexml solution the datatype would become xsddecimal the shexml datatype inference is indeed nave as the author puts it but arguably also incorrect since i believe it will result in incorrect or unintended inferences for users about the naivety of the implementation the author states however it can lead to a more complex inference system if it is desired or needed however there is no further mention on how this can be done would this be part of the language and if so how 32 join on literal as a reader unfamiliar with shexml it is hard to figure out just from a shexml snippet how the joins work it would be better to succinctly explain how joins in shexml work or provide a reference and thereby show that this is not a challenge in shexml i feel like this section in its current form could be replaced with just a reference to the shexml specification and the supplementary webpage showing that shexml has solved this challenge 33 multivalue references the author states therefore it seems that in hierarchical data the expected output should be a verbatim translation it is unclear to me what a verbatim translation means in this context or how the above conclusion is reached making the rest of the section also unclear to me 41 access fields outside iterators this section introduces an interesting mechanism to move a field down the hierarchy during iteration however the explanation is really superficial and lacks clarity next to this from looking at the shexml snippet i think the semantics of the words push and pop are strangely applied in this solution i would expect a pushed field to be pushed down the hierarchy and a popped field to be popped up a level in the hierarchy or something in that vein but that doesnt seem to be the case here if i understand it correctly the popped field is not really popped but references the pushed field which is not really pushed but saved during a higher iteration for use in its nested iterations 52 excel style the author explains how one could generically solve this issue but it is unclear how this relates to shexml docsep this paper presents improvements of the shexml mapping language in line with the mapping challenges proposed in the knowledge graph construction w3c community group the author describes the challenges dividing them in three categories already implemented before the proposal of the challenges from the community group implemented after the proposal and not implemented with descriptions of how they will be integrated in the language i find the paper as a relevant contribution it gives perspective about applying the mapping challenges to other languages that doesnt match the r2rml schema the proposal improves shexml in terms of both syntax and implementation and match the current needs to describe and transform data although not all of the issues are solved yet some details i appreciate are the idea of linking every improvement to a version the regression testing and regarding the datatype and language maps how they are validated it is overall well written and structured yet i would like to remark some issues i think need to be addressed in section 2 the resources provided should be described more in detail im especially referring to input 1 input 2 etc that are frequently mentioned in subsequent sections and not mentioned in this section only the reference link is provided with barely a description section 44 should come before section 43 i think it makes more sense to describe 
datatype and language maps before the examples in listings 17 and 18 additionally it would be nice to see in section 43 some example with datatype maps not only language maps regarding the language map i think a clarification of whether it is possible to write in the same rule the datatype and language tags is advisable despite the fact that the language is only applicable when the datatype is a string in section 45 it is mentioned that sparqlgenerate is able to describe rdf collections i find it relevant to also cite xr2rml in this matter minor issues and typos introduction page 1 line 9 which are nowadays complicated to solve complicate complicated page 10 when the style of the excel sheet is wanted to be preserved want is wanted general lack of commas some sentences are too long and without pause ### Summary:
the four reviewers agreed on the relevance of the presented work for the paper we would encourage the author to look into the comments provided eg research questions vs engineering problems and include them in the cameraready version of the paper consider that the paper can be extended to up to 15 pages for sure this work will generate useful and interesting discussions during its presentation in the workshop david
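The datatype-inference behaviour questioned in the first review of this record (stringify the value, try a fixed list of parsers, return the first datatype that parses) and the alternative the reviewers point toward (derive the datatype from the source model's native types, the way R2RML maps SQL types) can both be sketched in a few lines. The snippet below is only an illustration of those two strategies; the parser list, the XSD names, and the example value are assumptions chosen for clarity, not ShExML's engine code.

```python
from datetime import date

XSD = "http://www.w3.org/2001/XMLSchema#"

def naive_infer(value):
    """Try parsers in a fixed order on the *stringified* value and return the
    first XSD datatype that parses (the behaviour the review questions)."""
    s = str(value)
    parsers = [
        ("integer", int),
        ("decimal", float),
        ("date", date.fromisoformat),
        ("boolean", lambda v: {"true": True, "false": False}[v.lower()]),
    ]
    for name, parse in parsers:
        try:
            parse(s)
            return XSD + name
        except (ValueError, KeyError):
            continue
    return XSD + "string"

def json_native_infer(value):
    """Alternative: derive the datatype from the JSON value's native type, so a
    JSON *string* that happens to look numeric is not silently promoted."""
    if isinstance(value, bool):
        return XSD + "boolean"
    if isinstance(value, int):
        return XSD + "integer"
    if isinstance(value, float):
        return XSD + "decimal"
    return XSD + "string"

print(naive_infer("3.14"))        # ...#decimal (the promotion the review objects to)
print(json_native_infer("3.14"))  # ...#string  (respects the JSON source type)
```

On a JSON string such as "3.14" the two strategies disagree in exactly the way the review flags: the parse-and-guess approach reports xsd:decimal, while the native-type mapping keeps xsd:string.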
[ input_ids and attention_mask arrays for the shexml example above omitted for readability ]
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 3304, 1111, 5316, 12091, 50275, 9088, 403, 690, 3448, 16503, 275, 253, 2505, 24088, 268, 18, 5177, 366, 943, 320, 9542, 19017, 50276, 7152, 339, 431, 248, 2929, 8631, 849, 253, 3448, 703, 7229, 476, 2953, 2710, 7881, 273, 941, 9261, 326, 452, 644, 3636, 275, 253, 3640, 4216, 5140, 3114, 1387, 352, 5012, 327, 752, 369, 1896, 275, 253, 2393, 9508, 273, 253, 3448, 752, 2544, 497, 1160, 281, 2953, 625, 7881, 285, 47515, 1335, 5780, 281, 320, 14042, 50276, 2520, 2929, 812, 320, 4217, 281, 952, 665, 403, 970, 703, 7229, 672, 11372, 342, 2074, 7881, 347, 1110, 3636, 407, 253, 3114, 1387, 390, 952, 665, 651, 751, 281, 3359, 247, 2900, 323, 10603, 941, 281, 391, 4989, 824, 326, 253, 2900, 12453, 253, 7881, 2299, 347, 247, 2561, 2929, 352, 1057, 417, 3324, 1199, 50276, 7053, 891, 651, 8219, 326, 253, 2561, 3533, 2530, 387, 253, 5068, 403, 625, 11369, 3237, 685, 2032, 2561, 597, 403, 2074, 281, 849, 281, 3359, 1269, 275, 3448, 340, 323, 4227, 849, 281, 3359, 247, 4384, 4771, 275, 6313, 21512, 436, 310, 6296, 247, 5691, 533, 10693, 247, 2561, 1895, 253, 2561, 1953, 943, 2581, 320, 1633, 751, 849, 281, 2953, 253, 5691, 1269, 342, 8284, 340, 285, 6830, 8284, 340, 38554, 253, 2900, 4404, 970, 247, 2176, 3448, 1955, 281, 2176, 5319, 273, 253, 3448, 50276, 9815, 1014, 604, 436, 5012, 327, 703, 7229, 697, 5606, 285, 7092, 352, 651, 452, 644, 1175, 281, 2085, 5301, 342, 643, 11515, 476, 391, 1686, 2953, 253, 1072, 7881, 281, 752, 6070, 849, 9542, 403, 6425, 42794, 275, 391, 1686, 752, 670, 653, 274, 5848, 16450, 534, 310, 8489, 2074, 275, 697, 16144, 281, 703, 7229, 50275, 3062, 2173, 5701, 275, 2829, 337, 352, 310, 417, 2590, 752, 253, 1390, 5084, 2097, 275, 16485, 1903, 2139, 417, 2486, 247, 9436, 281, 12282, 285, 2856, 46988, 275, 16485, 1249, 253, 6604, 327, 806, 1590, 396, 1317, 8921, 672, 253, 4736, 310, 281, 755, 2021, 4454, 50275, 6795, 254, 3374, 26432, 849, 476, 417, 9713, 7881, 50276, 5430, 476, 440, 1911, 2079, 7881, 4706, 19, 849, 417, 14042, 7881, 50276, 5430, 5061, 5336, 7881, 4706, 20, 1110, 534, 403, 3986, 494, 342, 50276, 21808, 326, 403, 3986, 494, 342, 4706, 1610, 436, 1083, 3249, 625, 9542, 50276, 2520, 1083, 4916, 625, 9542, 50276, 1704, 3156, 878, 281, 4044, 2193, 534, 403, 4651, 50276, 22990, 281, 4044, 2193, 326, 403, 4651, 4706, 2031, 47657, 352, 347, 19554, 347, 1896, 50276, 11145, 352, 347, 2969, 347, 1896, 4706, 22, 281, 294, 24286, 50276, 936, 294, 18959, 4706, 3583, 281, 6455, 731, 281, 43153, 247, 840, 1555, 50276, 936, 6455, 731, 281, 43153, 840, 1555, 4706, 23, 44351, 382, 50276, 261, 436, 247, 1524, 3159, 891, 812, 417, 1089, 352, 275, 2067, 277, 49580, 1275, 721, 458, 925, 1377, 10225, 50276, 282, 925, 4692, 261, 1275, 854, 269, 2369, 277, 12303, 7366, 50276, 38479, 4876, 3058, 1275, 884, 285, 1903, 1269, 83, 19, 1109, 77, 50276, 89, 83, 19, 1109, 392, 406, 33032, 2520, 2929, 10262, 271, 18389, 273, 849, 703, 7229, 35910, 390, 812, 8415, 253, 10603, 7881, 22691, 407, 253, 3640, 4216, 5140, 3114, 1387, 253, 2605, 273, 253, 2929, 310, 973, 24286, 562, 4983, 432, 1110, 7881, 326, 497, 2168, 6107, 407, 703, 7229, 1078, 253, 7881, 497, 22691, 4886, 281, 18149, 5611, 281, 3835, 2067, 273, 253, 22691, 7881, 285, 12365, 342, 247, 5955, 327, 1110, 7881, 1335, 1527, 275, 5886, 281, 703, 7229, 50276, 66, 2266, 1127, 273, 436, 2929, 310, 326, 352, 2789, 697, 3916, 4354, 2336, 18397, 407, 20057, 281, 247, 24864, 42498, 4508, 271, 18389, 273, 253, 7881, 285, 1408, 79, 494, 32367, 326, 921, 326, 253, 3948, 476, 6296, 1677, 253, 6508, 21031, 8415, 253, 10603, 7881, 347, 2529, 275, 253, 
2929, 436, 310, 29837, 50276, 1747, 12436, 1598, 891, 1089, 253, 2929, 310, 3542, 275, 824, 247, 1039, 326, 368, 476, 760, 2096, 4243, 273, 352, 604, 368, 403, 7615, 342, 703, 7229, 1580, 253, 2488, 2223, 3400, 1652, 45860, 8813, 273, 849, 253, 4623, 3448, 21031, 789, 3185, 253, 2488, 10770, 281, 703, 7229, 3802, 46588, 342, 6667, 273, 253, 3732, 21031, 6108, 253, 9414, 281, 4677, 352, 562, 327, 616, 1211, 50276, 24042, 253, 2929, 352, 310, 2223, 12744, 1880, 253, 2488, 310, 44978, 703, 7229, 253, 3448, 390, 253, 7092, 273, 253, 3948, 390, 1097, 253, 2929, 671, 27171, 432, 247, 1270, 1142, 47412, 474, 6332, 4768, 417, 45934, 1239, 1430, 273, 1798, 3877, 310, 253, 2408, 273, 5816, 7774, 751, 253, 390, 247, 50276, 262, 310, 417, 2590, 281, 479, 715, 534, 273, 253, 2929, 9050, 436, 19529, 11521, 281, 479, 352, 3133, 281, 3761, 253, 985, 32971, 9380, 7140, 954, 253, 7616, 2408, 273, 7223, 323, 247, 985, 32971, 2929, 310, 7904, 534, 436, 2929, 33478, 891, 2868, 253, 2929, 812, 320, 36439, 407, 15706, 2593, 495, 342, 247, 3806, 281, 253, 24864, 5482, 42498, 1580, 342, 253, 3710, 8813, 273, 253, 21031, 891, 1928, 352, 11323, 1652, 1318, 281, 253, 2929, 2593, 608, 812, 671, 320, 36439, 1580, 954, 273, 253, 2852, 5231, 403, 2529, 275, 247, 35377, 5133, 50276, 1189, 455, 516, 247, 2372, 15070, 3738, 891, 2868, 253, 18149, 281, 703, 7229, 534, 8415, 253, 10603, 7881, 651, 320, 247, 1077, 4722, 7680, 281, 253, 22586, 891, 1928, 436, 2929, 1335, 3198, 1534, 789, 2218, 281, 50275, 49027, 32434, 390, 4081, 5482, 625, 4518, 275, 253, 2505, 50276, 14458, 257, 253, 2929, 281, 1561, 253, 14493, 273, 253, 3239, 2701, 873, 323, 985, 32971, 9380, 5734, 436, 310, 1529, 1511, 273, 2929, 534, 943, 840, 320, 1160, 625, 2590, 50276, 11097, 253, 28146, 50276, 12157, 273, 436, 891, 9644, 5777, 4404, 12009, 50276, 27490, 209, 422, 5728, 247, 1643, 3081, 16157, 591, 2593, 50276, 2405, 2856, 39960, 3711, 3280, 608, 50275, 249, 253, 1083, 273, 3280, 608, 352, 310, 6034, 326, 10603, 11515, 651, 320, 2104, 281, 6635, 2856, 39960, 14610, 2556, 281, 253, 954, 14051, 1318, 2556, 281, 2193, 21453, 323, 1650, 275, 3280, 608, 352, 310, 3264, 326, 1180, 495, 651, 452, 271, 1269, 8289, 18743, 2856, 39960, 285, 33198, 651, 452, 271, 1269, 8289, 8632, 1983, 581, 50276, 13565, 387, 253, 3280, 253, 1318, 323, 930, 275, 253, 1273, 1789, 310, 33198, 1024, 891, 13414, 871, 604, 436, 369, 22991, 533, 275, 619, 4743, 352, 16681, 247, 1534, 1895, 342, 253, 3559, 2746, 275, 703, 7229, 604, 824, 12077, 2856, 39960, 27377, 497, 281, 320, 4516, 597, 943, 275, 619, 4743, 760, 9441, 1754, 327, 2130, 1491, 275, 253, 2603, 3389, 2605, 326, 310, 281, 1333, 604, 253, 2603, 3389, 1511, 556, 2176, 7925, 2856, 255, 3170, 265, 581, 812, 897, 326, 1491, 281, 9441, 247, 2856, 39960, 1309, 10603, 275, 253, 1083, 273, 253, 1318, 33198, 275, 247, 14113, 3389, 891, 651, 7964, 1902, 33198, 281, 452, 2856, 39960, 1269, 8289, 2703, 984, 253, 14113, 2856, 39960, 310, 671, 2876, 2299, 275, 703, 7229, 2900, 253, 2856, 39960, 651, 2489, 1269, 8289, 8632, 1983, 253, 703, 7229, 2856, 39960, 17032, 310, 6296, 295, 1123, 347, 253, 2488, 12516, 352, 533, 25711, 671, 13583, 1580, 891, 2868, 352, 588, 906, 275, 13583, 390, 46504, 27377, 323, 4212, 50276, 10383, 253, 5549, 400, 1734, 273, 253, 7092, 253, 2488, 3054, 50275, 35529, 352, 476, 1421, 281, 247, 625, 2570, 17032, 985, 604, 352, 310, 6799, 390, 3058, 50276, 35529, 627, 310, 642, 2007, 3748, 327, 849, 436, 476, 320, 2218, 651, 436, 320, 629, 273, 253, 3448, 285, 604, 594, 849, 50276, 1237, 6604, 327, 22436, 50276, 
284, 247, 9414, 32139, 342, 703, 7229, 352, 310, 1892, 281, 4677, 562, 816, 432, 247, 703, 7229, 36408, 849, 253, 27022, 789, 352, 651, 320, 1805, 281, 18382, 4291, 314, 5513, 849, 27022, 275, 703, 7229, 789, 390, 2085, 247, 3806, 285, 7624, 921, 326, 436, 310, 417, 247, 5691, 275, 703, 7229, 891, 1928, 751, 436, 2593, 275, 697, 1655, 830, 812, 320, 7932, 342, 816, 247, 3806, 281, 253, 703, 7229, 17776, 285, 253, 24864, 42498, 4645, 326, 703, 7229, 556, 14042, 436, 5691, 50276, 1610, 1554, 2401, 489, 10414, 50276, 783, 2488, 3054, 50275, 45230, 352, 3133, 326, 275, 24498, 941, 253, 3264, 3453, 943, 320, 247, 2336, 37438, 10234, 50276, 262, 310, 12744, 281, 479, 752, 247, 2336, 37438, 10234, 2097, 275, 436, 3634, 390, 849, 253, 1840, 6452, 310, 4925, 2403, 253, 1551, 273, 253, 2593, 671, 12744, 281, 479, 50276, 3156, 2289, 4910, 3345, 10040, 2392, 50276, 2520, 2593, 23970, 271, 4722, 5122, 281, 2118, 247, 1673, 1066, 253, 19868, 1309, 19502, 2299, 253, 8813, 310, 1663, 28019, 285, 19756, 19843, 1735, 281, 436, 432, 2819, 387, 253, 703, 7229, 36408, 891, 1158, 253, 35185, 273, 253, 3000, 7450, 285, 1684, 403, 38612, 3732, 275, 436, 2900, 891, 651, 1902, 247, 10184, 1673, 281, 320, 10184, 1066, 253, 19868, 285, 247, 32773, 1673, 281, 320, 32773, 598, 247, 1268, 275, 253, 19868, 390, 1633, 275, 326, 17716, 533, 326, 36908, 1646, 281, 320, 253, 1083, 1060, 604, 891, 2096, 352, 9113, 253, 32773, 1673, 310, 417, 1663, 32773, 533, 10414, 253, 10184, 1673, 534, 310, 417, 1663, 10184, 533, 9809, 1309, 247, 2169, 19502, 323, 897, 275, 697, 20494, 25142, 50276, 3583, 34219, 3740, 50276, 783, 2488, 11424, 849, 581, 812, 1006, 1037, 8415, 436, 2523, 533, 352, 310, 12744, 849, 436, 7033, 281, 703, 7229, 5474, 33032, 436, 2929, 10262, 11701, 273, 253, 703, 7229, 10603, 3448, 275, 1386, 342, 253, 10603, 7881, 4081, 275, 253, 3640, 4216, 5140, 259, 20, 68, 3114, 1387, 253, 2488, 8631, 253, 7881, 23534, 731, 275, 1264, 9050, 2168, 9009, 1078, 253, 10419, 273, 253, 7881, 432, 253, 3114, 1387, 9009, 846, 253, 10419, 285, 417, 9009, 342, 20121, 273, 849, 597, 588, 320, 8527, 275, 253, 3448, 50275, 74, 1089, 253, 2929, 347, 247, 4623, 7680, 352, 4245, 8668, 670, 9433, 253, 10603, 7881, 281, 643, 11515, 326, 36908, 3761, 253, 391, 19, 1109, 77, 20824, 253, 10419, 19132, 703, 7229, 275, 2426, 273, 1097, 16144, 285, 7092, 285, 3761, 253, 1655, 3198, 281, 6266, 285, 4979, 941, 3738, 417, 512, 273, 253, 3374, 403, 14042, 2568, 690, 4278, 891, 11435, 403, 253, 2934, 273, 20057, 1046, 7756, 281, 247, 2715, 253, 9077, 5175, 285, 5001, 253, 2856, 39960, 285, 3448, 8115, 849, 597, 403, 17618, 352, 310, 4583, 973, 3542, 285, 18872, 2568, 891, 651, 751, 281, 7579, 690, 3374, 891, 1158, 878, 281, 320, 9713, 50274, 249, 2593, 374, 253, 5300, 2530, 943, 320, 2529, 625, 275, 2508, 516, 3340, 14339, 281, 3280, 337, 3280, 374, 3966, 326, 403, 7208, 5393, 275, 6774, 7118, 285, 417, 5393, 275, 436, 2593, 760, 253, 3806, 3048, 310, 2530, 342, 12345, 247, 5740, 50276, 4674, 7127, 943, 1705, 1078, 2593, 7652, 891, 1158, 352, 2789, 625, 3282, 281, 6266, 2856, 39960, 285, 3448, 8115, 1078, 253, 6667, 275, 32985, 1722, 285, 1283, 23000, 352, 651, 320, 5322, 281, 923, 275, 2593, 7652, 690, 1650, 342, 2856, 39960, 8115, 417, 760, 3448, 8115, 5001, 253, 3448, 3711, 891, 1158, 247, 37699, 273, 1880, 352, 310, 1896, 281, 3630, 275, 253, 1072, 4086, 253, 2856, 39960, 285, 3448, 14610, 310, 15237, 494, 5747, 253, 958, 326, 253, 3448, 310, 760, 7763, 672, 253, 2856, 39960, 310, 247, 2876, 50276, 249, 2593, 5329, 352, 310, 5393, 326, 653, 274, 
5848, 16450, 310, 2104, 281, 6266, 391, 4989, 18406, 891, 1089, 352, 4623, 281, 671, 26542, 1269, 83, 19, 1109, 77, 275, 436, 2647, 50276, 37585, 3374, 285, 963, 993, 50276, 46089, 3239, 337, 1386, 898, 534, 403, 31735, 9542, 281, 8415, 50276, 681, 21821, 50276, 681, 37787, 50276, 6377, 884, 672, 253, 3740, 273, 253, 34219, 8335, 310, 3078, 281, 320, 15296, 971, 50276, 261, 3078, 50276, 16691, 3480, 273, 764, 284, 690, 14683, 403, 1512, 1048, 285, 1293, 19309, 187, 187, 4118, 18435, 27, 783, 1740, 30628, 5821, 327, 253, 17200, 273, 253, 3559, 789, 323, 253, 2929, 359, 651, 11907, 253, 2488, 281, 1007, 715, 253, 5701, 2530, 24088, 2561, 3533, 4632, 11369, 3237, 285, 2486, 731, 275, 253, 4049, 254, 609, 5102, 2715, 273, 253, 2929, 1908, 326, 253, 2929, 476, 320, 6508, 1919, 1458, 7223, 323, 2119, 436, 789, 588, 6635, 4217, 285, 4722, 11985, 1309, 697, 9759, 275, 253, 22586, 50276, 34926, 301 ]
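The three numeric columns that close each row above are just the tokenized form of the same text: input_ids encode the review-plus-summary string as token ids, attention_mask is a run of ones because no padding is applied, and labels mirror input_ids for next-token prediction. As a rough illustration only, the sketch below shows how rows of this shape are commonly built for causal-LM fine-tuning; the tokenizer checkpoint, the helper name, and the exact prompt wording are assumptions, not details taken from this dataset.

```python
# Minimal sketch (an assumption, not the pipeline actually used for this dump):
# how rows with input_ids / attention_mask / labels columns are commonly built
# for causal-LM fine-tuning. The tokenizer checkpoint is a placeholder and the
# prompt wording simply mirrors the rows above.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder checkpoint

def build_row(review_text: str, summary_text: str) -> dict:
    prompt = (
        "Below is a review of a research paper from a conference or journal. "
        "Please write a summary of the review.\n### Review:\n"
        + review_text
        + "\n### Summary:\n"
    )
    enc = tokenizer(prompt + summary_text)  # input_ids plus an all-ones attention_mask (no padding)
    enc["labels"] = list(enc["input_ids"])  # labels mirror input_ids for plain next-token prediction
    return dict(enc)
```

In the rows above, labels appear to repeat input_ids verbatim and attention_mask has the same length as input_ids, which is consistent with this unpadded, unmasked setup.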
Below is a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: 1 the problem considered in the paper is both well motivated from an application perspective and interesting from a theoretical lens 2 the proposed algorithms are appropriately described further performance guarantees have been provided for these algorithms which go on to show optimality of the proposed algorithms in some cases 3 the paper is well written the organization of the paper makes it easy to follow the key ideas further the related works section effectively communicates the existing literature and the similarities and differences of this paper with other relevant academic papers 1 the section on the special case when each pij is either 0 or a fixed value p is not very well motivated in application intuitively it seems that there would be large variations in fame of influencers given that the audience outreach can vary significantly across influencers it would be useful to elaborate further on the application motivation here 2 the proof of theorem 2 seems to be only for the case when n = 2, s = 1, c1 = c2 = 1 proof of generalization to all cases has not been provided 3 as currently stated theorem 2 is for all values of n s c whereas theorem 3 seems to be only for n = 2, s = 1, c1 = c2 = 1 this distinction should be clearly stated in the text descriptions and the abstract 4 for ease of following the proof arguments for theorem 5 it would be helpful to have a proof sketch 5 a numerical study on the performance of the proposed algorithms can further substantiate the value of the algorithms for improvement suggestions please refer to the section on weaknesses docseptight results for the deterministic case some exploration for the deterministic case wellwritten rigorous analysis the problem feels a little artificial why are all the s the same but the ci are different make the xi clearer on page 2 docsepthe paper deals with a variant of the classic online bipartite matching problem this is a very well studied problem that has a large impact on several areas of ai in particular in the setting of online advertising and matching markets the version of the problem proposed in the paper is motivated by specific applications in the setting of live streaming advertisement the paper appears to be technically solid and sound technical results are not trivial and proofs are given of all the main statements the organization of the paper is good no experiments are presented to support and complement the competitive analysis of the proposed algorithms in particular for the randomized algorithm it could help to have an experimental analysis of the competitiveness of the random algorithm in general theorem 3 holds only for the case with at most 2 arrivals and 1 node assigned technical parts are not so easy to read and a little bit of intuition could help the reader that is not an expert in the field of online algorithms and competitive analysis could you say something about the performance of algorithm random in more general cases ### Summary:
meta review this paper considers a variant of the online bipartite matching problem and proposes corresponding algorithms with guarantees on their competitive ratio the reviewers are all positive about this paper the paper could be more complete if some empirical analyses were provided
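The reviews and meta-review above reason about deterministic and randomized algorithms for a variant of online bipartite matching in terms of their competitive ratios. For readers unfamiliar with that framing, the sketch below is only the textbook greedy baseline for the standard unweighted online problem; it is not the paper's algorithm, which per the review also involves the parameters s and c and capacity constraints.

```python
# Reference-only sketch of the textbook greedy baseline for unweighted online
# bipartite matching; NOT the paper's algorithm (which, per the review above,
# also involves the parameters s and c and per-node capacities).
def greedy_online_matching(offline_nodes, arrivals):
    """arrivals: sequence of (online_id, neighbors) pairs revealed one at a time."""
    free = set(offline_nodes)
    matching = {}
    for online_id, neighbors in arrivals:
        for u in neighbors:          # take any currently free neighbor
            if u in free:
                matching[online_id] = u
                free.remove(u)
                break
    return matching

# Tiny example: prints {'x': 'a', 'y': 'b'}; any maximal matching produced this
# way is at least half the size of an optimal one, hence greedy is 1/2-competitive.
print(greedy_online_matching({"a", "b"}, [("x", ["a"]), ("y", ["a", "b"]), ("z", ["b"])]))
```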
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 18, 253, 1895, 2783, 275, 253, 2929, 310, 1097, 973, 17194, 432, 271, 2898, 8668, 285, 4722, 432, 247, 10527, 9655, 50276, 19, 253, 4081, 11333, 403, 20420, 2529, 2007, 3045, 23632, 452, 644, 2530, 323, 841, 11333, 534, 564, 327, 281, 921, 5556, 1319, 273, 253, 4081, 11333, 275, 690, 2219, 50276, 20, 253, 2929, 310, 973, 3542, 253, 6003, 273, 253, 2929, 2789, 352, 3477, 281, 956, 253, 2234, 5697, 2007, 253, 2905, 2987, 2593, 8069, 3461, 684, 253, 5368, 6239, 285, 253, 22620, 285, 3910, 273, 436, 2929, 342, 643, 4623, 11073, 9380, 50275, 18, 253, 2593, 327, 253, 2714, 1083, 672, 1016, 268, 1944, 310, 2057, 470, 390, 247, 4229, 1318, 268, 310, 417, 1077, 973, 17194, 275, 2898, 540, 41597, 352, 3133, 326, 627, 651, 320, 1781, 10575, 275, 23898, 273, 3108, 2083, 398, 1677, 326, 253, 8446, 37344, 476, 6889, 3012, 2439, 3108, 2083, 398, 352, 651, 320, 4217, 281, 21184, 2007, 327, 253, 2898, 16038, 1060, 374, 253, 4737, 273, 10012, 374, 3133, 281, 320, 760, 323, 253, 1083, 672, 295, 19, 256, 18, 260, 18, 50276, 68, 19, 50276, 18, 4737, 273, 26647, 281, 512, 2219, 556, 417, 644, 2530, 495, 347, 4390, 4767, 10012, 374, 310, 323, 512, 2193, 273, 295, 256, 260, 5727, 10012, 495, 3133, 281, 320, 760, 323, 295, 19, 256, 18, 260, 18, 50276, 68, 19, 50276, 18, 436, 13812, 943, 320, 4518, 4767, 275, 253, 2505, 20121, 285, 253, 12002, 577, 323, 11990, 273, 1563, 253, 4737, 7125, 323, 10012, 608, 352, 651, 320, 9371, 281, 452, 247, 4737, 23211, 608, 247, 10704, 1263, 327, 253, 3045, 273, 253, 4081, 11333, 476, 2007, 4326, 4513, 253, 1318, 273, 253, 11333, 50276, 1542, 7756, 13991, 4496, 3730, 281, 253, 2593, 327, 32213, 5474, 339, 431, 429, 1543, 323, 253, 30027, 1083, 690, 17947, 323, 253, 30027, 1083, 50276, 4714, 15720, 26565, 1783, 253, 1895, 9193, 247, 1652, 13345, 2139, 403, 512, 253, 256, 253, 1072, 533, 253, 16399, 403, 1027, 1056, 253, 1269, 74, 30909, 327, 3239, 374, 5474, 339, 431, 248, 2929, 13330, 342, 247, 12955, 273, 253, 10610, 3909, 49240, 11038, 1895, 436, 310, 247, 1077, 4714, 5421, 1895, 326, 556, 247, 1781, 3486, 327, 2067, 3672, 273, 23105, 275, 1798, 275, 253, 4758, 273, 3909, 12089, 285, 11038, 10169, 253, 2715, 273, 253, 1895, 4081, 275, 253, 2929, 310, 17194, 407, 2173, 4893, 275, 253, 4758, 273, 3153, 18361, 28064, 50276, 783, 2929, 4620, 281, 320, 22335, 4891, 285, 3590, 7681, 1543, 403, 417, 14916, 285, 27947, 403, 1677, 273, 512, 253, 2022, 7234, 253, 6003, 273, 253, 2929, 310, 1175, 642, 4679, 403, 3559, 281, 1329, 285, 13503, 253, 12085, 1783, 273, 253, 4081, 11333, 275, 1798, 323, 253, 14871, 5933, 352, 812, 1361, 281, 452, 271, 5661, 1783, 273, 253, 3947, 48826, 273, 253, 3632, 5933, 275, 2087, 10012, 495, 6556, 760, 323, 253, 1083, 342, 387, 954, 374, 4582, 932, 285, 337, 4666, 7922, 50276, 48746, 4243, 403, 417, 594, 3477, 281, 1239, 285, 247, 1652, 2372, 273, 30328, 812, 1361, 253, 9414, 326, 310, 417, 271, 6485, 275, 253, 1673, 273, 3909, 11333, 285, 12085, 1783, 50276, 16534, 368, 1333, 1633, 670, 253, 3045, 273, 5933, 3632, 275, 625, 2087, 2219, 2490, 187, 4118, 18435, 27, 13518, 2278, 436, 2929, 19401, 247, 12955, 273, 253, 3909, 49240, 11038, 1895, 285, 29328, 3969, 11333, 342, 23632, 327, 616, 12085, 4313, 253, 30628, 403, 512, 2762, 670, 436, 2929, 253, 2929, 812, 320, 625, 3426, 604, 690, 16774, 6260, 497, 2530 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 18, 253, 1895, 2783, 275, 253, 2929, 310, 1097, 973, 17194, 432, 271, 2898, 8668, 285, 4722, 432, 247, 10527, 9655, 50276, 19, 253, 4081, 11333, 403, 20420, 2529, 2007, 3045, 23632, 452, 644, 2530, 323, 841, 11333, 534, 564, 327, 281, 921, 5556, 1319, 273, 253, 4081, 11333, 275, 690, 2219, 50276, 20, 253, 2929, 310, 973, 3542, 253, 6003, 273, 253, 2929, 2789, 352, 3477, 281, 956, 253, 2234, 5697, 2007, 253, 2905, 2987, 2593, 8069, 3461, 684, 253, 5368, 6239, 285, 253, 22620, 285, 3910, 273, 436, 2929, 342, 643, 4623, 11073, 9380, 50275, 18, 253, 2593, 327, 253, 2714, 1083, 672, 1016, 268, 1944, 310, 2057, 470, 390, 247, 4229, 1318, 268, 310, 417, 1077, 973, 17194, 275, 2898, 540, 41597, 352, 3133, 326, 627, 651, 320, 1781, 10575, 275, 23898, 273, 3108, 2083, 398, 1677, 326, 253, 8446, 37344, 476, 6889, 3012, 2439, 3108, 2083, 398, 352, 651, 320, 4217, 281, 21184, 2007, 327, 253, 2898, 16038, 1060, 374, 253, 4737, 273, 10012, 374, 3133, 281, 320, 760, 323, 253, 1083, 672, 295, 19, 256, 18, 260, 18, 50276, 68, 19, 50276, 18, 4737, 273, 26647, 281, 512, 2219, 556, 417, 644, 2530, 495, 347, 4390, 4767, 10012, 374, 310, 323, 512, 2193, 273, 295, 256, 260, 5727, 10012, 495, 3133, 281, 320, 760, 323, 295, 19, 256, 18, 260, 18, 50276, 68, 19, 50276, 18, 436, 13812, 943, 320, 4518, 4767, 275, 253, 2505, 20121, 285, 253, 12002, 577, 323, 11990, 273, 1563, 253, 4737, 7125, 323, 10012, 608, 352, 651, 320, 9371, 281, 452, 247, 4737, 23211, 608, 247, 10704, 1263, 327, 253, 3045, 273, 253, 4081, 11333, 476, 2007, 4326, 4513, 253, 1318, 273, 253, 11333, 50276, 1542, 7756, 13991, 4496, 3730, 281, 253, 2593, 327, 32213, 5474, 339, 431, 429, 1543, 323, 253, 30027, 1083, 690, 17947, 323, 253, 30027, 1083, 50276, 4714, 15720, 26565, 1783, 253, 1895, 9193, 247, 1652, 13345, 2139, 403, 512, 253, 256, 253, 1072, 533, 253, 16399, 403, 1027, 1056, 253, 1269, 74, 30909, 327, 3239, 374, 5474, 339, 431, 248, 2929, 13330, 342, 247, 12955, 273, 253, 10610, 3909, 49240, 11038, 1895, 436, 310, 247, 1077, 4714, 5421, 1895, 326, 556, 247, 1781, 3486, 327, 2067, 3672, 273, 23105, 275, 1798, 275, 253, 4758, 273, 3909, 12089, 285, 11038, 10169, 253, 2715, 273, 253, 1895, 4081, 275, 253, 2929, 310, 17194, 407, 2173, 4893, 275, 253, 4758, 273, 3153, 18361, 28064, 50276, 783, 2929, 4620, 281, 320, 22335, 4891, 285, 3590, 7681, 1543, 403, 417, 14916, 285, 27947, 403, 1677, 273, 512, 253, 2022, 7234, 253, 6003, 273, 253, 2929, 310, 1175, 642, 4679, 403, 3559, 281, 1329, 285, 13503, 253, 12085, 1783, 273, 253, 4081, 11333, 275, 1798, 323, 253, 14871, 5933, 352, 812, 1361, 281, 452, 271, 5661, 1783, 273, 253, 3947, 48826, 273, 253, 3632, 5933, 275, 2087, 10012, 495, 6556, 760, 323, 253, 1083, 342, 387, 954, 374, 4582, 932, 285, 337, 4666, 7922, 50276, 48746, 4243, 403, 417, 594, 3477, 281, 1239, 285, 247, 1652, 2372, 273, 30328, 812, 1361, 253, 9414, 326, 310, 417, 271, 6485, 275, 253, 1673, 273, 3909, 11333, 285, 12085, 1783, 50276, 16534, 368, 1333, 1633, 670, 253, 3045, 273, 5933, 3632, 275, 625, 2087, 2219, 2490, 187, 4118, 18435, 27, 13518, 2278, 436, 2929, 19401, 247, 12955, 273, 253, 3909, 49240, 11038, 1895, 285, 29328, 3969, 11333, 342, 23632, 327, 616, 12085, 4313, 253, 30628, 403, 512, 2762, 670, 436, 2929, 253, 2929, 812, 320, 625, 3426, 604, 690, 16774, 6260, 497, 2530 ]
Below is a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: this paper highlights and studies the interesting possibility of hiding information within neural network weights which is a form of steganography the sensitivity of different neural network layers to perturbations is evaluated and based on this a technique for hiding information is proposed and demonstrated it is shown that it is possible to hide information in the weights of a number of standard baseline neural networks without being easily detectable overall this is an interesting paper which highlighted a possibility which i was not aware of although i do have a little familiarity with steganography my only significant criticism is that i would really like to see a more thorough exploration and discussion of the information hiding capacity of neural networks in particular it would be great if the authors could explore the relationship between the quantity of information that can be hidden and the sizenumber of parameters as far as i can see the sizes of the weights are only mentioned in passing once on the first page of the paper where they are said to be in the hundreds of megabytes an understatement these days and the paper does not say how much information they were able to hide in weights as a bare minimum i would ask that the authors state how much information they were able to hide as in how many megabytes as well as the actual sizes of the weights for the neural networks used rather than the vague hundreds of megabytes a more thorough exploration of the relationship between neural net size and stego capacity would be even better does it follow the square root law typical in other settings see eg httpwwwcsoxacukandrewkerdocsadk71bpdf perhaps this more thorough exploration can be deferred to future work apart from this i have a number of more minor criticisms and suggested improvements multiple issues with figure 1 in the first sentence of the caption fraction bit distributions you should probably replace distributions with histograms the yaxis of at least one of the histogram plots should be labelled i guess with frequency so that its clear what the numerical values represent instead of top left three you should be more specific and say top row first three generally there were a lot of grammatical errors and spelling mistakes and you should spend more time carefully proofreading the paper a probably not exhaustive list of examples the first sentence of the second paragraph on page 2 beginning since the fraction bits is not a valid sentence neural is misspelled in the first line of the last paragraph of page 3 the title of section 2 should be background not backgrounds in first sentence of section 4 what does their refer to probably should replace their usefulness with the usefulness of dnns about two thirds of the way through the first paragraph in section 44 in the sentence beginning in particular fraction bits an that an should be an and also in we made further divided them into two types delete made the third sentence of the second paragraph of section 5 beginning in particular our experiment is not a valid sentence dont use double parentheses for citations as you do on the last line of page 1 you can use semicolons to separate the citations from the other text inside the parentheses in the third paragraph of page 3 you say a trivial approach to hide replace the least significant bits in the pixel values of the secret image im pretty sure you mean cover image there
not secret image in the second paragraph of page 4 could you explain the difference between watermarking and steganography in the fifth paragraph on page 5 you say a load of this is far too colloquial for an academic paper the investigation represented by table 1 could have been more detailed for example you could have tested as others have done previously whether outputs are more sensitive to weights closer to the inputoutput of the nn you could also note that its pretty obvious that a nn would be more sensitive to batchnorm parameters than others since these affect the scale and position of all of the outputs of a layer the way you structure the experiments section with two short introductory paragraphs followed by subsections could be streamlined merge the informative parts of the intro paragraphs into the subsections and then delete the intro paragraphsdocsepthis paper proposed a method to hide information in the parameters of neural network models to avoid significant perturbation the paper only considers embedding the information in the fraction bits of the parameters the paper considers hiding the information in either the least significant bits or the most significant bits hiding in the least significant bits is harder to be detected but the message can also be easily removed without much degradation in the model performance on the other hand information hiding in the most significant fraction bits will be very hard to remove without model performance degradation but also harder to embed the message for the same reason sensitivity analysis to select least sensitive parameters to use and finetuning after embedding to recover the model performance can be two remedies overall the paper is clearly written and the proposed method is well supported by the experiments some questions comments below how many files in each format are used to generate the plots in figure 1 is it statistically significant sec 33 in the experiment section we partitioned the parameters of a model into two where the former is used to hide secrets into one part while the latter is finetuned its not clear to me what the criteria is for partitioning the parameters is it purely based on sensitivity analysis sec 4 all the values reported in the table are the averaged ones across ten different runs can the standard deviation be added to the results as well sec 41 since the number of bn parameters only accounts for less than 1 of the entire parameters which indicates low capacity we excluded bn parameters in later experiments does it mean all models used do not have bn layers re table 2 its interesting to see some models are not sensitive to even msfb perturbations any insight on the specific reasondocsepthis paper investigates if neural network parameters trained for a standard task such as image recognition can be used as a cover medium in steganography the authors contend that neural networks are a good choice of cover medium mainly because the less important fractional bits of parameters follow a uniform distribution this is empirically demonstrated in figure 1 where the probability of least significant bits taking values 0 or 1 is 50 for vgg resnet densenet i assume these are imagenet models judging from later experiments the authors experiment with replacing either the least or most significant bits from the fractional part of parameter values the motivation behind replacing the most significant bits is that it will be more difficult for an attacker to remove secret information bits of lower significance could be
removed or replaced with little effect on test accuracy while removing more significant fractional bits will have a larger impact on accuracy the downside of this approach is that replacing more important fractional bits with secrets will also have a larger impact on accuracy but the authors show finetuning can somewhat alleviate this drawback the authors then show simple mlps trained to distinguish between standard and stego networks are only slightly better than a random coin flip overall i thought the paper was generally wellwritten and contains some interesting ideas but these positives are also accompanied by an unclear motivation lack of positioning with respect to related work and poor experimental evaluation my primary concern is the practical motivation for this idea in steganography it is normally assumed that the cover and stego imageaudionetwork cannot both be revealed otherwise a trivial comparison will reveal secret information is present thus breaking security of the scheme does this mean a new model needs to be trained each time two parties want to communicate if so this seems to represent a serious limitation since training these models requires a nontrivial amount of effort in comparison to say generating an image or audio file as the cover furthermore a problematic scaling law then appears where larger models need to be trained on presumably more complex tasks to recover some plausible deniability eg it would be suspicious to see a resnet50 trained on mnist to hide larger messages these larger models are much more expensive to train and again generating highresolution images is a much cheaper option sharing large resolution images will generally consume less memory than very large neural networks i wonder how the authors view the practicality of this work it would have been great to compare this scheme with some standard steganography schemes on images or audio across desiderata such as bandwidth robustness etc the authors may not be aware but there has been work on information hiding in previous work song et al 2017 1 also investigate how information can be imperceptibly embedded and recovered from neural networks in addition to experiments on embedding secrets in fractional bits they have experiments in blackbox settings could the authors comment on the relationship between this work and song et al 2017 1 as far as i can tell they are quite similar ideas i was a little disappointed that the authors didnt include any practical demonstrations such as successful recovery of secrets as shown in song et al 2017 1 steganographic schemes are evaluated by security capacity and robustness as the authors comment in section 21 but most of the experiments in section 4 concentrate on capacity how does the scheme stand up against robustness attacks such as downstream task finetuning parameter pruning etc i was also confused by the experiment setup in section 44 for a fair experiment of distinguishability i had expected a large number of cover and stego models to be trained for some notion of confidence and then use a steganalysis tool to distinguish between the two im not sure training a steganalysis model on a single stego network allows for a fair interpretation of performance 1 song congzheng thomas ristenpart and vitaly shmatikov machine learning models that remember too much proceedings of the 2017 acm sigsac conference on computer and communications security 2017 ### Summary:
this work presents an original analysis of using the weights of a neural network as a medium on which to hide information although the paper offers a novel perspective its motivation and applicability remain unclear as reviewer 3 points out the proposed method does not seem very practical for any particular application and the authors do not give a practical demonstration that shows the usefulness of the approach its not clear how the paper should be positioned with respect to previous work and the proposed method is not directly compared with standard steganography schemes on metrics such as bandwidth robustness etc making it difficult to assess the value of the contribution for these reasons i recommend rejecting the paper in its current form
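The review and meta-review above turn on a concrete mechanism: secrets are written into the fraction (mantissa) bits of floating-point weights, either the least significant bits (easy to remove, hard to detect) or the most significant ones (harder to remove, more damaging to accuracy). The sketch below illustrates only the single least-significant-bit variant for float32 parameters, with hypothetical helper names; it omits the sensitivity analysis, the post-embedding finetuning, and the MLP-based steganalysis the reviewers discuss, and it is not the authors' implementation.

```python
# Illustration only (not the paper's method): write one secret bit into the
# lowest mantissa bit of each float32 weight, and read it back out.
# secret_bits is assumed to contain only 0s and 1s.
import numpy as np

def embed_bits(weights: np.ndarray, secret_bits: np.ndarray) -> np.ndarray:
    assert weights.dtype == np.float32 and secret_bits.size <= weights.size
    raw = weights.ravel().copy().view(np.uint32)      # reinterpret floats as raw 32-bit words
    raw[:secret_bits.size] &= np.uint32(0xFFFFFFFE)   # clear the lowest mantissa bit
    raw[:secret_bits.size] |= secret_bits.astype(np.uint32)
    return raw.view(np.float32).reshape(weights.shape)

def extract_bits(stego_weights: np.ndarray, n_bits: int) -> np.ndarray:
    raw = np.ascontiguousarray(stego_weights, dtype=np.float32).ravel().view(np.uint32)
    return (raw[:n_bits] & np.uint32(1)).astype(np.uint8)
```

In this single-bit form the capacity is one bit per parameter, i.e. 1/32 of the raw float32 model size, which is the size-versus-capacity relationship the first review asks the authors to quantify; the concerns about removal and detection in the other reviews correspond to overwriting or reading back these same low bits.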
[ 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 16681, 285, 2175, 253, 4722, 6387, 273, 17197, 1491, 1561, 11454, 2990, 13461, 534, 310, 247, 830, 273, 331, 30558, 3756, 253, 7340, 273, 1027, 11454, 2990, 8090, 281, 26309, 310, 6760, 285, 1754, 327, 436, 247, 5853, 323, 17197, 1491, 310, 4081, 285, 5183, 352, 310, 2011, 326, 352, 310, 1896, 281, 10507, 1491, 275, 253, 13461, 273, 247, 1180, 273, 2629, 8245, 11454, 6928, 1293, 1146, 4354, 21759, 50276, 1189, 455, 436, 310, 271, 4722, 2929, 534, 16318, 247, 6387, 534, 891, 369, 417, 6600, 273, 3738, 891, 513, 452, 247, 1652, 38550, 342, 331, 30558, 3756, 619, 760, 1534, 14226, 310, 326, 891, 651, 1663, 751, 281, 923, 247, 625, 11080, 17947, 285, 5955, 273, 253, 1491, 17197, 5350, 273, 11454, 6928, 275, 1798, 352, 651, 320, 1270, 604, 253, 4477, 812, 8338, 253, 2954, 875, 253, 10671, 273, 1491, 326, 476, 320, 8763, 285, 253, 256, 27297, 2764, 273, 3602, 347, 2080, 347, 891, 476, 923, 253, 9552, 273, 253, 13461, 403, 760, 5393, 275, 8136, 2378, 327, 253, 806, 3239, 273, 253, 2929, 835, 597, 403, 753, 281, 320, 275, 253, 8307, 273, 19488, 357, 12742, 271, 762, 25322, 841, 1897, 285, 253, 2929, 1057, 417, 1333, 849, 1199, 1491, 597, 497, 2104, 281, 10507, 275, 13461, 347, 247, 8050, 5927, 891, 651, 1642, 326, 253, 4477, 1375, 849, 1199, 1491, 597, 497, 2104, 281, 10507, 347, 275, 849, 1142, 19488, 357, 12742, 347, 973, 347, 253, 4588, 9552, 273, 253, 13461, 323, 253, 11454, 6928, 908, 2581, 685, 253, 21248, 8307, 273, 19488, 357, 12742, 247, 625, 11080, 17947, 273, 253, 2954, 875, 11454, 2036, 1979, 285, 331, 32844, 5350, 651, 320, 1014, 1805, 1057, 352, 956, 253, 6278, 5230, 1569, 6867, 275, 643, 7533, 923, 24088, 3944, 2700, 68, 601, 89, 317, 2788, 395, 2663, 6426, 13880, 324, 76, 3677, 67, 9275, 4931, 436, 625, 11080, 17947, 476, 320, 36334, 281, 2852, 789, 50276, 522, 435, 432, 436, 891, 452, 247, 1180, 273, 625, 5884, 43680, 285, 5125, 11701, 50275, 34263, 3374, 342, 4677, 337, 50273, 249, 253, 806, 6197, 273, 253, 11743, 6919, 2372, 10670, 368, 943, 3164, 8171, 10670, 342, 47846, 50273, 783, 340, 10565, 273, 387, 1878, 581, 273, 253, 33105, 14777, 943, 320, 27214, 891, 5476, 342, 4294, 594, 326, 697, 2590, 752, 253, 10704, 2193, 1957, 50273, 34235, 273, 1755, 1669, 1264, 368, 943, 320, 625, 2173, 285, 1333, 1755, 4194, 806, 1264, 50275, 43786, 627, 497, 247, 2257, 273, 47412, 474, 6332, 285, 33797, 16503, 285, 368, 943, 6947, 625, 673, 9257, 4737, 24042, 253, 2929, 247, 3164, 417, 41389, 1618, 273, 6667, 50273, 783, 806, 6197, 273, 253, 1273, 12494, 327, 3239, 374, 5068, 1580, 253, 6919, 9886, 310, 417, 247, 3588, 6197, 50273, 570, 1546, 310, 2985, 81, 5911, 275, 253, 806, 1386, 273, 253, 1390, 12494, 273, 3239, 495, 50273, 783, 4060, 273, 2593, 374, 943, 320, 4114, 417, 24550, 50273, 249, 806, 6197, 273, 2593, 577, 752, 1057, 616, 3730, 281, 3164, 943, 8171, 616, 31471, 342, 253, 31471, 273, 277, 79, 2224, 50273, 10383, 767, 289, 14950, 273, 253, 1039, 949, 253, 806, 12494, 275, 2593, 7127, 275, 253, 6197, 5068, 275, 1798, 50276, 46991, 9886, 271, 326, 271, 943, 320, 271, 285, 671, 275, 359, 1160, 2007, 4272, 731, 715, 767, 3510, 11352, 1160, 50273, 783, 2626, 6197, 273, 253, 1273, 12494, 273, 2593, 608, 5068, 275, 1798, 776, 3368, 310, 417, 247, 3588, 6197, 50275, 69, 834, 897, 4021, 41616, 323, 30404, 347, 368, 513, 327, 253, 1390, 1386, 273, 3239, 337, 368, 476, 897, 3300, 20917, 790, 281, 4858, 253, 30404, 432, 253, 643, 2505, 3304, 253, 41616, 
50275, 249, 253, 2626, 12494, 273, 3239, 495, 368, 1333, 247, 14916, 2746, 281, 10507, 8171, 253, 1878, 1534, 9886, 275, 253, 12275, 2193, 273, 253, 4279, 2460, 516, 3965, 2119, 368, 1599, 3835, 2460, 627, 417, 4279, 2460, 50275, 249, 253, 1273, 12494, 273, 3239, 577, 812, 368, 5513, 253, 3064, 875, 37385, 782, 272, 285, 331, 30558, 3756, 50275, 249, 253, 10720, 12494, 327, 3239, 608, 368, 1333, 247, 3301, 273, 436, 310, 2080, 1512, 50014, 451, 323, 271, 11073, 2929, 50275, 783, 5839, 6607, 407, 2829, 337, 812, 452, 644, 625, 7000, 323, 1650, 368, 812, 452, 5762, 347, 2571, 452, 2218, 3786, 1880, 18012, 403, 625, 7996, 281, 13461, 8003, 281, 253, 3280, 9252, 273, 253, 48257, 368, 812, 671, 3877, 326, 697, 3965, 4755, 326, 247, 48257, 651, 320, 625, 7996, 281, 10464, 1451, 526, 3602, 685, 2571, 1580, 841, 2818, 253, 4311, 285, 1899, 273, 512, 273, 253, 18012, 273, 247, 3828, 50275, 783, 1039, 368, 2605, 253, 4679, 2593, 342, 767, 2159, 47649, 33295, 3560, 407, 749, 21454, 812, 320, 5542, 12490, 17310, 253, 27096, 4243, 273, 253, 26432, 33295, 715, 253, 749, 21454, 285, 840, 11352, 253, 26432, 33295, 7152, 33032, 2520, 2929, 4081, 247, 1332, 281, 10507, 1491, 275, 253, 3602, 273, 11454, 2990, 3210, 281, 3693, 1534, 20452, 253, 2929, 760, 19401, 8473, 253, 1491, 275, 253, 6919, 9886, 273, 253, 3602, 253, 2929, 19401, 17197, 253, 1491, 275, 2057, 253, 1878, 1534, 9886, 273, 253, 954, 1534, 9886, 17197, 275, 253, 1878, 1534, 9886, 310, 12150, 281, 320, 5189, 533, 253, 3935, 476, 671, 320, 4354, 5176, 1293, 1199, 11961, 275, 253, 1566, 3045, 327, 253, 643, 1133, 1491, 50276, 73, 2821, 275, 253, 954, 1534, 6919, 9886, 588, 320, 1077, 1892, 281, 5386, 1293, 1566, 3045, 11961, 533, 671, 12150, 281, 8473, 253, 3935, 323, 253, 1072, 1921, 7340, 1783, 281, 3609, 1878, 7996, 3602, 281, 897, 285, 1442, 292, 25004, 846, 21496, 281, 9295, 253, 1566, 3045, 476, 320, 767, 24371, 50276, 1189, 455, 253, 2929, 310, 4518, 3542, 285, 253, 4081, 1332, 310, 973, 4516, 407, 253, 4679, 690, 3533, 50276, 26122, 2708, 50276, 5430, 1142, 4367, 275, 1016, 5981, 403, 908, 281, 6635, 253, 14777, 275, 4677, 337, 310, 352, 10126, 1534, 50276, 1704, 5922, 275, 253, 3368, 2593, 359, 10883, 264, 253, 3602, 273, 247, 1566, 715, 767, 835, 253, 3438, 310, 908, 281, 10507, 20196, 715, 581, 629, 1223, 253, 6158, 310, 1442, 292, 37437, 697, 417, 2590, 281, 479, 752, 253, 6866, 310, 323, 41463, 253, 3602, 310, 352, 15846, 1754, 327, 7340, 1783, 50276, 1704, 577, 512, 253, 2193, 2361, 275, 253, 2829, 403, 253, 17522, 4394, 2439, 3578, 1027, 6613, 476, 253, 2629, 11254, 320, 2879, 281, 253, 1543, 347, 973, 50276, 1704, 7609, 1580, 253, 1180, 273, 270, 79, 3602, 760, 2395, 323, 1679, 685, 337, 273, 253, 2862, 3602, 534, 6492, 1698, 5350, 359, 10432, 270, 79, 3602, 275, 1996, 4679, 1057, 352, 1599, 512, 3210, 908, 513, 417, 452, 270, 79, 8090, 50276, 250, 2829, 374, 697, 4722, 281, 923, 690, 3210, 403, 417, 7996, 281, 1014, 278, 6091, 67, 26309, 667, 12288, 327, 253, 2173, 294, 284, 857, 406, 33032, 2520, 2929, 2340, 684, 604, 11454, 2990, 3602, 10166, 323, 247, 2629, 4836, 824, 347, 2460, 8981, 476, 320, 908, 347, 247, 3835, 4646, 275, 331, 30558, 3756, 253, 4477, 21244, 326, 11454, 6928, 403, 247, 1175, 4327, 273, 3835, 4646, 7194, 984, 253, 1679, 1774, 24622, 9886, 273, 3602, 956, 247, 6447, 3268, 436, 310, 45190, 5183, 275, 4677, 337, 835, 253, 5912, 273, 1878, 1534, 9886, 3192, 2193, 470, 390, 337, 403, 2456, 323, 362, 1266, 501, 3024, 12006, 257, 292, 891, 5467, 841, 403, 4440, 257, 292, 3210, 32721, 432, 1996, 4679, 253, 4477, 
3368, 342, 15706, 2057, 253, 1878, 390, 954, 1534, 9886, 432, 253, 24622, 629, 273, 4764, 2193, 253, 16038, 3212, 15706, 253, 954, 1534, 9886, 310, 326, 352, 588, 320, 625, 2834, 323, 271, 30539, 281, 5386, 4279, 1491, 9886, 273, 2406, 8453, 812, 320, 5176, 390, 7932, 342, 1652, 1055, 327, 1071, 7200, 1223, 11922, 625, 1534, 24622, 9886, 588, 452, 247, 4067, 3486, 327, 7200, 253, 42719, 273, 436, 2746, 310, 326, 15706, 625, 1774, 24622, 9886, 342, 20196, 588, 671, 452, 247, 4067, 3486, 327, 7200, 533, 253, 4477, 921, 1442, 292, 25004, 476, 8489, 33623, 436, 32489, 253, 4477, 840, 921, 2969, 13361, 793, 10166, 281, 12129, 875, 2629, 285, 331, 32844, 6928, 403, 760, 5777, 1805, 685, 247, 3632, 18011, 19153, 4583, 891, 1869, 253, 2929, 369, 3839, 973, 15720, 285, 4428, 690, 4722, 5697, 533, 841, 37865, 403, 671, 11704, 407, 271, 12744, 16038, 3480, 273, 19274, 342, 1675, 281, 2905, 789, 285, 4105, 5661, 7103, 50276, 2577, 3625, 4468, 310, 253, 8542, 16038, 323, 436, 2934, 275, 331, 30558, 3756, 352, 310, 9403, 8025, 326, 1097, 253, 3835, 285, 331, 909, 2460, 5353, 279, 292, 1601, 2550, 1097, 320, 4950, 5010, 247, 14916, 5301, 588, 10313, 4279, 1491, 310, 1246, 3021, 10155, 3988, 273, 253, 6974, 1057, 436, 1599, 247, 747, 1566, 3198, 281, 320, 10166, 1016, 673, 767, 4676, 971, 281, 13791, 604, 594, 436, 3133, 281, 1957, 247, 4092, 12291, 1580, 3733, 841, 3210, 4419, 247, 37825, 2408, 273, 3434, 275, 5301, 281, 1333, 11365, 271, 2460, 390, 9797, 1873, 347, 253, 3835, 33810, 247, 20276, 13642, 1569, 840, 4620, 835, 4067, 3210, 878, 281, 320, 10166, 327, 18289, 625, 2570, 8892, 281, 9295, 690, 21541, 1850, 74, 1430, 24088, 352, 651, 320, 20634, 281, 923, 247, 501, 3024, 1235, 10166, 327, 278, 79, 382, 281, 10507, 4067, 8169, 841, 4067, 3210, 403, 1199, 625, 8214, 281, 6194, 285, 969, 11365, 1029, 21061, 3888, 310, 247, 1199, 20182, 4500, 9628, 1781, 6064, 3888, 588, 3839, 21409, 1679, 3541, 685, 1077, 1781, 11454, 6928, 891, 4282, 849, 253, 4477, 1859, 253, 8542, 414, 273, 436, 789, 352, 651, 452, 644, 1270, 281, 7277, 436, 6974, 342, 690, 2629, 331, 30558, 3756, 15849, 327, 3888, 390, 9797, 2439, 711, 1334, 18438, 824, 347, 16992, 31640, 3966, 50276, 783, 4477, 778, 417, 320, 6600, 533, 627, 556, 644, 789, 327, 1491, 17197, 275, 2045, 789, 4498, 1162, 355, 4240, 337, 671, 7409, 849, 1491, 476, 320, 9719, 916, 4360, 12691, 285, 12372, 432, 11454, 6928, 275, 1635, 281, 4679, 327, 21496, 20196, 275, 24622, 9886, 597, 452, 4679, 275, 2806, 3364, 7533, 812, 253, 4477, 4385, 327, 2954, 875, 436, 789, 285, 4498, 1162, 355, 4240, 337, 347, 2080, 347, 891, 476, 2028, 597, 403, 3240, 2074, 5697, 50276, 74, 369, 247, 1652, 19271, 326, 253, 4477, 42126, 2486, 667, 8542, 32367, 824, 347, 5547, 7355, 273, 20196, 347, 2011, 275, 4498, 1162, 355, 4240, 337, 331, 30558, 5576, 6974, 403, 6760, 407, 3988, 5350, 285, 31640, 347, 253, 4477, 4385, 275, 2593, 3127, 533, 954, 273, 253, 4679, 275, 2593, 577, 21364, 327, 5350, 849, 1057, 253, 6974, 1462, 484, 1411, 31640, 8104, 824, 347, 15450, 4836, 1442, 292, 25004, 4764, 819, 25004, 3966, 891, 369, 671, 13477, 407, 253, 3368, 9978, 275, 2593, 7127, 323, 247, 4344, 3368, 273, 12129, 1430, 891, 574, 3264, 247, 1781, 1180, 273, 3835, 285, 331, 32844, 3210, 281, 320, 10166, 323, 690, 10732, 273, 7162, 285, 840, 897, 247, 331, 909, 12792, 4968, 281, 12129, 875, 253, 767, 516, 417, 2119, 3733, 247, 331, 909, 12792, 1566, 327, 247, 2014, 331, 32844, 2990, 4483, 323, 247, 4344, 7914, 273, 3045, 50274, 18, 4498, 345, 20449, 24176, 289, 4921, 391, 26307, 2003, 285, 12232, 90, 
439, 2056, 34089, 5145, 4715, 3210, 326, 4456, 1512, 1199, 10061, 273, 253, 4240, 913, 78, 9788, 38346, 8059, 327, 4382, 285, 10924, 3988, 4240, 2490, 187, 4118, 18435, 27, 2520, 789, 10262, 271, 3236, 1783, 273, 970, 253, 13461, 273, 247, 11454, 2990, 347, 247, 4646, 327, 534, 281, 10507, 1491, 3738, 253, 2929, 6131, 247, 4460, 8668, 697, 16038, 285, 30437, 3464, 12744, 347, 37317, 495, 2792, 562, 253, 4081, 1332, 1057, 417, 1646, 1077, 8542, 323, 667, 1798, 2898, 285, 253, 4477, 513, 417, 1918, 247, 8542, 20028, 326, 2722, 253, 31471, 273, 253, 2746, 697, 417, 2590, 849, 253, 2929, 943, 320, 15471, 342, 1675, 281, 2045, 789, 285, 253, 4081, 1332, 310, 417, 3587, 2429, 342, 2629, 331, 30558, 3756, 15849, 327, 17082, 824, 347, 16992, 31640, 3966, 2403, 352, 2834, 281, 2939, 253, 1318, 273, 253, 7680, 323, 841, 4606, 891, 5583, 33944, 253, 2929, 275, 697, 1655, 830 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the paper presents a platform for audiovisual simulation that can be used for audiovisual rendering the paper also presents a dataset called soundspacespanoir for acoustic-visual learning tasks the simulation platform has multiple configurable parameters 1 the platform can be very useful for the research community as it supports audiovisual rendering and can be used in any environment 2 the platform supports acoustic and spatial continuity and has multiple configurable parameters like frequency bands diffraction etc 3 the authors have shown results for multiple downstream tasks such as farfield speech recognition 1 the baseline could have been improved 2 the authors addressed improving the accuracy of the direct-to-reverberant ratio drr by fixing the bias of 4 present in the original soundspaces however no discussion is presented on this in the paper 3 there is no discussion on the impact of reducing the number of rays in highspeed mode how does the reduction in the number of rays affect the performance 4 subsection 5.2 compares the simulation accuracy with real irs using the frl apartment from the replica dataset the comparison should be made in more diverse settings like crowded apartments and other environment locations
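
as an illustration of the metrics this review and the next keep returning to, the sketch below shows one common way to estimate rt60 (via schroeder backward integration) and drr (direct versus late energy) from a sampled impulse response. this is a generic estimate rather than the paper's evaluation code, and the fit range, direct-path window and variable names are illustrative assumptions; a frequency-dependent variant would apply the same estimate per octave band after band-pass filtering.

```python
import numpy as np

def schroeder_rt60(ir, fs, decay_db=30.0):
    """rt60 via backward (schroeder) integration: fit the -5 dB to -(5 + decay_db) dB
    region of the energy decay curve and extrapolate to 60 dB of decay."""
    energy = np.asarray(ir, dtype=np.float64) ** 2
    edc = np.cumsum(energy[::-1])[::-1]                # energy decay curve
    edc_db = 10.0 * np.log10(edc / edc[0] + 1e-12)
    t = np.arange(len(edc_db)) / fs
    fit = (edc_db <= -5.0) & (edc_db >= -(5.0 + decay_db))
    slope, _ = np.polyfit(t[fit], edc_db[fit], 1)      # dB per second, negative
    return -60.0 / slope                               # seconds to decay by 60 dB

def drr_db(ir, fs, direct_window_ms=2.5):
    """direct-to-reverberant ratio: energy in a short window around the direct
    arrival versus everything that comes after it."""
    ir = np.asarray(ir, dtype=np.float64)
    peak = int(np.argmax(np.abs(ir)))
    half = int(direct_window_ms * 1e-3 * fs)
    direct = np.sum(ir[max(peak - half, 0):peak + half] ** 2)
    late = np.sum(ir[peak + half:] ** 2)
    return 10.0 * np.log10(direct / (late + 1e-12))
```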
docsepthe authors present a second generation of the soundspaces tool which is designed to simulate realistic acoustic reverberation from 3-dimensional room scene models such as those already used for rendering computer graphics such a tool facilitates largescale rendering of audio for simulated scenes the authors have already run this simulation at scale and release a data set of 10 million pairs of rendered images with corresponding acoustic impulse responses irs in addition to obvious uses in computer game audio rendering such a tool may aid in the development of machine hearing algorithms like speech recognition which often show pathological behaviour in the presence of reverberation that humans would consider very moderate of course the use of such a tool and associated simulated dataset is predicated on the simulation accuracy which itself relies on the accuracy of properties of the user generated scene model to demonstrate the accuracy of soundspaces 2.0 irs the authors show two demonstrations that appear successful but in this reviewers view are not as strong as they could or should be this does not mean the work isnt useful but the paper should certainly acknowledge the shortcomings directly and clearly additionally the authors present two additional benchmark tasks in one a simulated agent must localize a sound in a complex simulated environment here the authors demonstrate that soundspaces 2.0 improves upon the first generation soundspaces in conveying simulated source location via simulated sound the other benchmark compares two soundspaces 2.0 simulations of reverberation for one room one made with highquality mode and the other with highspeed mode while these two benchmark tasks are not without merit they do seem to belong in a different category than the tests that involve realworld audio i believe mixing all four tasks into one section on evaluation and benchmarks obscures an important difference the claims made with entirely simulated tasks are well justified but those about accuracy are not

the authors deserve great credit for tackling a relevant problem and providing tools that allow audio rendering to be run at scale with preexisting scene simulation tools as is already common with image rendering the synthetic reverberation might even prove to be accurate however the task of validating this is more difficult than suggested by this paper and that should be addressed moreover there is no reason to think highquality reverberation can be simulated with a flawed 3d scene model no matter how good the simulation algorithm and the paper and documentation should warn users of pitfalls the codebase presented here provides a useful tool for synthesizing acoustic reverberation using mesh grids formatted for common scene simulation tools the codebase improves upon its first generation soundspaces by allowing continuous sampling of locations for the sound source and receiver this almost certainly results in more accurate changes of sound with listener location which is plausibly why simulated agents can better localize sources in such an environment spatial audio in vr environments may well also benefit from this update the authors have provided evidence of sim2real transfer for a speech recognition algorithm trained on synthetic reverberation i think their test could have gone farther as i will elaborate upon later but nonetheless this first attempt and the public release of the tools to build thereupon is laudable

the authors claim in the abstract that the simulated reverberation is highly realistic they might be correct but i find the evidence for this disappointingly thin none of the issues i bring up here are cause for immediate rejection raytracing algorithms have a tremendous role to play in acoustics research and this tool will likely be useful in many research projects moreover given the dearth of largescale datasets of reverberant irs perhaps limited models are better than none however so that other researchers can make good use of this toolset these limitations should be clearly stated in the main text of the paper

the major issues i find are as follows the direct comparisons between simulated and recorded reverberation are limited to only one room while it is certainly true that realworld ir measurements are not easy 3 measurements each in 2 rooms would have been better than 7 measurements in one room moreover a great many past research papers on reverberation have measured 3-10 rooms and some many more timeconsuming is not the same as infeasible and if the authors want the research community to train algorithms with 10 million examples of simulated irs a comparison set of at least 10 rooms doesnt seem unreasonable given this the authors should moderate their claim of highly realistic and highlight more thorough tests with realworld irs as future work

the comparison between the recorded reverberation and the simulated is rather coarse both the rt60 and drr are frequency dependent in real rooms and large rooms exhibit prominent resonant peaks in the late tail due to resonant modes do the soundspaces 2.0 irs capture these structures if the answer is yes then so much the better for soundspaces 2.0 if no the discrepancies will be informative and useful to know

the simulation requires specifying a set of parameters for any material in the simulation scene absorption scattering transmission it is good that the authors have provided default values for common materials and it is good that users can specify such values via an api but i still have many questions how were the default values obtained from measurements or physics models or are they all obtained to optimize the fit to one single room as implied in footnote 7 moreover i am curious to know how dramatically the reverberation would change if these values were altered if the authors resimulated the 7 irs they compare with realworld recordings with the acoustic randomization procedure they applied in the speech recognition task by how much will the simulated rt60 and drr vary if the answer is not much the rt60 and drr are robust then a skeptical reader will be reassured that the close match shown in fig 3 is not just due to overfitting parameters to a single room if otherwise readers will be forewarned that their own simulations may require some care and thought in the initial setup to yield highquality audio in which case how should a user know how to adjust these parameters
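
the sensitivity question above can be checked roughly before running any ray tracing: a sabine-style back-of-envelope estimate already shows how much rt60 moves when the material absorption coefficients are jittered. the room dimensions, surface areas and absorption values below are made-up placeholders, not the defaults shipped with the simulator, so this is only a sanity-check sketch of the kind of robustness analysis the reviewer is asking for.

```python
import numpy as np

def sabine_rt60(volume_m3, surfaces):
    """sabine estimate rt60 = 0.161 * V / sum(alpha_i * S_i), with surfaces given
    as (area_m2, absorption_coefficient) pairs."""
    total_absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / total_absorption

rng = np.random.default_rng(0)
volume = 6.0 * 4.0 * 3.0                      # hypothetical 6 m x 4 m x 3 m room
surfaces = [(18.0, 0.30), (18.0, 0.30),       # long walls
            (12.0, 0.30), (12.0, 0.30),       # short walls
            (24.0, 0.05),                     # hard floor
            (24.0, 0.60)]                     # absorptive ceiling

# jitter every absorption coefficient by +/- 20 percent and look at the rt60 spread
rt60s = [sabine_rt60(volume, [(a, alpha * rng.uniform(0.8, 1.2)) for a, alpha in surfaces])
         for _ in range(1000)]
print(np.percentile(rt60s, [5, 50, 95]))      # 5th / median / 95th percentile rt60 in seconds
```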
whereas graphics rendering can ignore the thickness of walls and surfaces acoustic rendering does not have this luxury the absorption and transmission coefficients of real rooms will certainly be different for thin vs thick surfaces presumably this can be fixed by manually altering the material parameters but this may require manually adjusting these values for any given rendered room again future users of this simulation tool should be forewarned that they may have to think about these issues to get highquality results

in the supplemental material document the authors state regarding fig 4 the main reason for the matterport3d rt60 distribution skewing towards the left is because there are lots of broken meshes in that dataset which results in rays leaking from holes and smaller reverberation in general on the contrary gibson and hm3d have higher quality meshes and have larger rt60s on average of course the authors here cannot be blamed for problematic meshes a rendering algorithm can only be as good as the information fed into it but this does highlight an issue standard 3d meshes are optimized for image rendering which faces different constraints than audio rendering image rendering requires ray tracers to track a much smaller number of reflections than audio renderers so holes in the mesh which pose little problem for image rendering and might be common can prove problematic for audio this implies that a fraction of the 10 million irs in their public dataset underestimate the rt60 though it is unclear how many the dataset is still of use but this issue should be publicized also once more those who wish to use the simulation tool should be forewarned they need to check their mesh quality

in the speech recognition task the authors demonstrate that a speech recognition model trained on reverberant speech performs better when trained on soundspaces 2.0 reverberation rather than pyroomacoustics i assume they ensured that the distribution of rt60s and drrs for the two sets of simulated irs were matched but this is not stated in addition i would be curious to see the effect if the speech recognition algorithm was trained with recorded reverberation from another dataset presumably soundspaces 2.0 would yield better performance because of the size of the simulated dataset or with perceptually inspired statistical reverberation synthesis models such as used in hearing research or in music digital effects presumably here soundspaces 2.0 would yield better performance because the reverberation is more realistic if this were demonstrated it would provide evidence that soundspaces 2.0 really is a stateoftheart audio renderer until then i would suggest the authors moderate their claims of highly realistic and mention these other tests as useful future work

docsepin this submission the authors introduced soundspaces 2.0 a simulation platform that supports audio simulation with a number of improvements over existing platforms in addition the authors have validated the simulation accuracies with realworld recordings introduced a dataset based on the environment and conducted benchmarking on two tasks the contributions are significant first the simulation now supports realtime continuous audio rendering second the rendering is also very realistic based on bidirectional path tracing third the simulation is now configurable and can be used as a general tool to simulate any 3d environment all these features are much needed for embodied ai research the dataset and the benchmarking results are also good to have for the community i see two weaknesses regarding the submission first the methods used in benchmarking are limited the authors used one algorithm for each of the two benchmark tasks the one for navigation is also introduced in the previous version of the work itd be good to see a more systematic evaluation with additional methods second the simtoreal experiments can be strengthened the authors conducted some preliminary experiments in benchmark 2 itd be great to see more of these experiments in more benchmarks and tasks to show that not only the simulation itself is realistic but models trained in these environments do transfer to the real world

docsepthis paper proposed a new simulator for simulating acoustic effects the major difference compared to the prior work is that this work can compute sound properties on the fly and thus avoid using precomputed data therefore the new platform can support arbitrary geometry and continuous outputs the authors also perform two experiments to provide evidence for the usefulness of this work an upgraded version of the platform overcomes some major limitations of the prior work some experiments demonstrate that continuous inputs do impact model performance provided datasets and platforms can be quite useful to the community details of the sound sources are not clear an object might drop and only produce one impact sound not sure if the current platform supports this natural interaction for experiment 2 the authors are using highquality mode its interesting to see how the faster version compares to the slower version so we can better understand the impact of this tradeoff overall its a good paper however the authors should include more details to reach a broader audience and to help the community understand what can be done with the proposed platform a more detailed limitation section or some comparison with realworld sound sources should help

docsepsummary the authors introduce soundspaces 2.0 a platform for onthefly geometrybased audio rendering for 3d environments it allows continuous spatial sampling and demonstrates the effectiveness in two downstream tasks embodied navigation and farfield automatic speech recognition 1 novelty the proposed platform for continuous spatial sampling is novel and the ideas are original 2 detailed pipeline the authors provide the audiovisual rendering pipeline in detail and illustrate the heavy workload 3 experiments the authors compare and benchmark two downstream tasks continuous audiovisual navigation and farfield speech recognition and further provide insights on designing these systems 4 good presentation the paper is clearly constructed and presented 1 though two downstream tasks have been benchmarked it could be better to provide results on extended tasks such as crossmodal learning and more audioonly research 2 its better to provide statistics of the proposed dataset including evaluation and visualization of the duration number of speakers and so on

docsepthe paper introduces soundspaces 2.0 the first geometrybased realtime acoustic simulation platform for 3d environments the main difference from its previous version soundspaces is its continuous acoustic simulation as opposed to discrete which brings three benefits 1 more accurate acoustic simulation not restricted to predefined grid points 2 generalization to new scenes arbitrary input meshes 3 better configurability microphones materials etc in terms of experimental evaluation the authors compare the simulated sounds with realworld audio measurements they also showcase two downstream tasks that leverage the simulator audiovisual navigation and farfield automatic speech recognition

the paper proposes a solid opensourced implementation of an audio simulation environment that performs onthefly geometrybased audio rendering for arbitrary scenes its the first of its kind and can open up many research opportunities for multimodal embodied ai in terms of accessibility everything seems to be opensourced the platform supports two modes highspeed and highquality and also includes a largescale offline dataset called soundspacespanoir if the users are not interested in embodied ai tasks

the experimental evaluation section has a few noteworthy weaknesses first its a bit unclear why continuous audio simulation is important intuitively its better because its closer to the real world but the experimental results table 3 specifically are not very convincing comparing row 2 and row 3 its expected that training in soundspaces and testing in soundspaces 2.0 will cause a large domain shift which is known to be very detrimental for deep learning models its still unclear whether having continuous audio simulation is important if the ultimate goal is to deploy a home service robot in the real world a motivation that is repeatedly mentioned in the paper the results of table 4 are more convincing clearly soundspaces 2.0 generates more realistic synthetic irs than pyroomacoustics for the purpose of asr however table 4 should also include the result of finetuning on soundspaces in addition figure 3 b is a bit suspicious there seems to be a constant delta between the results of soundspaces and soundspaces 2.0 where is that delta coming from in other words why is soundspaces always producing a drr 10 db higher furthermore in table 2 the authors should probably also compare with soundspaces 1.0 and threedworld they have certain simplifications discrete grid points for soundspaces and shoebox simplification for tdw and presumably they can run faster the users should be aware of the full spectrum of the simulation speed versus quality tradeoff when choosing which simulator to use for example if some user needs 100 fps or 1000 fps for their rl use case maybe they will choose soundspaces over soundspaces 2.0 finally a clear comparison with threedworld seems to be missing in the experimental evaluation section threedworld also supports realtime continuous audio rendering and also impact sound simulation which is missing from soundspaces 2.0 the main limitation seems to be its assumption of simplified scene geometry shoebox the authors should demonstrate how this limitation affects the learning of downstream tasks including embodied ones eg audiovisual navigation or nonembodied ones eg asr
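
several of the reviews contrast soundspaces 2.0 with shoebox image-source baselines such as pyroomacoustics and with the far-field asr recipe of convolving clean speech with simulated irs. a minimal sketch of that baseline pipeline is below, assuming a recent pyroomacoustics and scipy install; the room size, absorption and source and microphone positions are arbitrary placeholders, and this is the simplified baseline the reviews criticize rather than the paper's own geometry-based renderer.

```python
import numpy as np
import pyroomacoustics as pra
from scipy.signal import fftconvolve

fs = 16000
# shoebox room rendered with the image-source method, i.e. the kind of simplified
# geometry the reviews contrast with geometry-based rendering of arbitrary meshes
room = pra.ShoeBox([6.0, 4.0, 3.0], fs=fs,
                   materials=pra.Material(energy_absorption=0.25),
                   max_order=17)
room.add_source([1.0, 1.5, 1.7])
room.add_microphone([4.5, 2.0, 1.6])
room.compute_rir()
rir = np.asarray(room.rir[0][0])              # impulse response: mic 0, source 0

# far-field asr style augmentation: convolve a clean utterance with the simulated ir
clean = np.random.randn(fs * 3)               # stand-in for 3 s of clean speech
reverberant = fftconvolve(clean, rir)[: len(clean)]
```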
### Summary:
this paper was reviewed by six experts and received all positive scores the ac feels this work is interesting and deserves to be published in the neurips 2022 dataset track the reviewers did raise some valuable concerns that should be addressed in the final cameraready version of the paper the authors are encouraged to make the necessary changes in the final version
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the idea of learning feature maps for spatially adjacent slices is simple and general the reported results are very competitive and were achieved on public datasets the manuscript is clearly written and easy to understand the contributions of the paper are clear and substantiated the method could be compared to more segmentation variants eg 2.5d segmentation approaches utilizing feature maps from neighboring slices is not entirely novel a similar idea has been previously suggested for videos by lin et al 2019 docsepthe paper is well written two main strengths in my opinion 1 enabling 2d convolutions to make use of information from neighboring slices by a distill dsm which learns to extract information from a part of the channels and shares it with neighboring slices 2 extensive evaluations on several datasets the proposed model achieves better performance than 3d cnn for heart and prostate datasets and comparable performance on brats 2020 pancreas and hippocampus dataset with simply 28 of parameters compared to 3d cnn model 1 another important class of efficient 2d3d approaches is not mentioned or discussed 2 some explanation of the results on brats 2020 and some clarification are needed for example the evaluation metric called specificity for brats 2020 may not be needed docsep1 the paper is written clearly the description of the method and the experiments are reasonable i can easily understand the proposed module 2 the proposed distill dsm is easy to follow and can be plugged into different 2d networks for 3d volumetric segmentation 3 the authors conducted abundant and wellorganized experiments to validate the effectiveness of the proposed distill dsm 1 the proposed distill dsm lacks theory or references supporting it and seems like an engineering application rather than innovation in method if the authors would like to prove the novelty of their proposed method it needs more theoretical explanation and method references 2 the description and visualization of methods seems to be too simple which easily makes readers confused for instance the description depth shift module lin et al 2019 shifts part of feature channels in each frame to its neighbouring frame so that 2d convolution could handle depth information does the dsm shift the same features in each frame to its last and next frames what fig 1 presents is not the same as the description docsepthe paper is very well written and easy to follow i have seen works attempting to use neighbouring slices in the input the so called 2.5d but this is the first time that i see this idea applied to convolutions in medical imaging highlight should be given to the as far as i know novel idea of judging which features should be included on the forward and backward slice instead of simply shifting channels its an interesting idea with potential for providing more efficient networks according to the authors findings comparisons were made with famous architectures and a similar approach residual dsm in multiple datasets bringing more validity to the authors claims overall i do not see major weaknesses or problems in this manuscript however there are some minor problems writing needs to be improved especially in the introduction give attention to proper use of articles and verb tenses additionally the introduction contains strong claims
without proper citations the explanation of how distill dsm mixes information from far away slices could be improved more minor details and suggestions in the detailed comments i see no promise to make code available but that would make this work even stronger and easier to reproduce and be cited by future works ### Summary:
the reviewers find the work of interest and there was initial consensus that the paper can be accepted this was confirmed after the rebuttal
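The depth shift idea discussed in the review above (shifting part of the feature channels to the neighbouring slice so that per-slice 2d convolutions can mix information across depth) can be sketched in a few lines. This is a minimal illustration assuming a (batch, channels, depth, height, width) tensor layout and an arbitrary shifted fraction; it is not the distill dsm module itself, which learns which features to pass to the neighbouring slices.

```python
import torch


def depth_shift(x, fraction=0.25):
    """Plain (non-learned) channel shift along the slice axis.

    x: tensor of shape (batch, channels, depth, height, width).
    The first `fraction` of the channels receives features from the
    previous slice, the next `fraction` from the next slice, and the
    remaining channels pass through unchanged, so a per-slice 2d
    convolution applied afterwards sees information from neighbouring
    slices.
    """
    _, c, _, _, _ = x.shape
    k = int(c * fraction)
    out = torch.zeros_like(x)
    out[:, :k, 1:] = x[:, :k, :-1]            # pull features from the previous slice
    out[:, k:2 * k, :-1] = x[:, k:2 * k, 1:]  # pull features from the next slice
    out[:, 2 * k:] = x[:, 2 * k:]             # untouched channels
    return out


if __name__ == "__main__":
    vol = torch.randn(1, 16, 8, 32, 32)
    print(depth_shift(vol).shape)  # torch.Size([1, 16, 8, 32, 32])
```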
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: authors propose a new perspective on adaptive gradient methods main contribution is a trust region based algorithm they call stochastic ellipsoidal trust region method that is flexible to include both full and diagonal matrix as the preconditioning matrix authors also mention that the preconditioners are generally diagonally dominant in practice and may only require diagonal matrix leaves full matrix for future work reason to score weak empirical results small models on small datasets without normalizing for batch sizes between experiments i have listed my concerns below and hopefully authors can address my concern during the rebuttal period i think the authors could substantially improve the empirical results in the paper by including commonly used adaptive methods as baseline such as adam and providing results on stronger baselines and break down on computational efficiency of the proposed approach in more details questions and comments a there is appendix c that states that batch size used for first order method is 32 vs for this method authors use 128 to 512 and then compare backprops this is extremely problematic when using backprop as a way to measure efficiency as this gives 4 to 16x improvement from just larger batch sizes i would suggest redoing experiments with exact same batch sizes b authors indicate using diagonal preconditioner could authors consider previous work on kronecker factored preconditioners such as kfac or shampoo that is computationally cheaper in their experiments c could authors also include walltime comparisons to split time spent in forward backward hessianvector product cg iterations including details on these as layer sizes increase for say up to 4k which are common in deep networks trained today d could you run a comparison against baselines and settings in httpwwwcstorontoedujmartensdocsdeephessianfreepdf docsepthis paper analyzes adaptive methods like adam and amsprop and shows that they can be reinterpreted as first order trust region methods with an ellipsoidal trust region lemma 1 the authors then propose a second order trust region method with similar ellipsoidal trust regions induced by the rmsprop matrices eq 7 under some assumptions they show that this algorithm will converge in a finite number of steps depending upon the accuracy desired they also show some experiments to demonstrate their algorithm the approach proposed in the paper is interesting but the significance of the paper is not clear the application of ellipsoidal trust region to newton algorithms can certainly help with the development of new optimizers but the presented algorithm was not very clear to me i assume approximate gt is the minibatch gradient but how is the approximate bt computed how is mtst computed approximately could you say a little more about assumptions 1 and 2 when do they hold finally the experimental results were not very clear it seems like ellipsoidal tr methods are often outperformed by first order methods also no generalization results on test sets were showndocsepin section 32 the authors use the ratio of the diagonal to the overall mass of a matrix to measure the quality of diagonal vs full preconditioning why is this a good measure motivation and comments about this measure are needed i think the most important thing to check here is the difference between a1g and diaga1g i think an interesting question that one may investigate here is to link this ratio to the
difference between a1g and diaga1g or to link this ratio to the complexity in proposition 1 define sigma1 and sigma2 i understand that they are defined in the appendix just move eq 42 to prop 1 the same about the relation between h and x in proposition 2 rewrite proposition 2 it is a bit misleading the limit when n goes to infinity should be independent of n you may write proportional instead of limit figure d2 shows that sqrtnsqrtn decreases to zero with n going to infinity however this quantity is nondecreasing as a function of n and it converges to 1 when n diverges page 6 after lemma 2 the authors stated that thm 1 shows the first convergence rate for ellipsoidal tr methods i disagree about this the authors may check for instance the work of conn et al this work is cited in the paper section 67 and bergou et al httpslinkspringercomarticle101007s1058901799292 assumption 1 the authors stated that for finitesum objectives such as eq 1 the above condition can be met by random subsampling due to classical concentration results for sums of random variables this is incorrect take for instance the case where in the finite sum only one term is not zero and all the others are equal to zero if the nonzero term is not in the subsampling for all t the bounds you mentioned may not be satisfied you can have these bounds but only in a probabilistic manner as in the blanchet et al work this work is cited in the paper docsepthe paper proposes novel stochastic ellipsoidal trustregion methods inspired by adaptive gradient methods and studies the application of them with adaptive diagonal preconditioners theoretical convergence analysis is provided for tr with rmsprop ellipsoid and numerical results demonstrate the superiority of ellipsoidal tr over uniform tr interestingly the paper shows for the first time that adaptive gradient methods can be viewed as firstorder tr with ellipsoidal constraints the negative comparative results with stateoftheart adaptive methods are appreciated showing that the trtype methods may not be great choices for deep network training since the hessians are often diagonally dominant in deeplearning practice and hence the benefit of secondorder methods is limited as said the paper is mostly wellwritten and presents an insightful investigation of stochastic ellipsoidal tr which could be potentially an alternative to stateoftheart adaptive gradient methods for modern deep network training and the paper gives a negative answer on the other hand the reviewer feels that the potential impact for the deep learning practitioners may be a bit limited and is not very sure about whether the contribution of this work is that significant the experimental details seem unclear in the experiments how were the approximate hessian bt calculated are they computed on the same sampled minibatch of gradient or on a new bigger minibatch meanwhile only training accuracies are reported which seems inadequate should also include the test accuracy plots ### Summary:
the paper considers adaptive stochastic optimization methods and shows that they can be reinterpreted as first order trust region methods with an ellipsoidal trust region they consider a related second order method and they show convergence properties and empirical results the results are of interest but the significance of some of the results is not clear part of this has to do with substance and part of this has to do with presentation that can be improved the empirical results are weak in particular with respect to appropriate baselines and details of the empirical results
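The diagonal-mass ratio questioned in the review above, and the gap between the fully preconditioned step and its diagonal approximation, are easy to probe numerically. The snippet below is a generic illustration on a synthetic positive-definite matrix; the matrix, the gradient and the exact ratio definition (sum of absolute diagonal entries over the sum of all absolute entries) are assumptions made for the sketch, not quantities taken from the paper under review.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic positive-definite matrix standing in for the preconditioner A
m = rng.standard_normal((50, 50))
a = m @ m.T + 50.0 * np.eye(50)
g = rng.standard_normal(50)

# ratio of the diagonal "mass" to the overall mass of the matrix
diag_ratio = np.abs(np.diag(a)).sum() / np.abs(a).sum()

# compare the fully preconditioned step with its diagonal approximation
full_step = np.linalg.solve(a, g)   # A^-1 g
diag_step = g / np.diag(a)          # diag(A)^-1 g
rel_gap = np.linalg.norm(full_step - diag_step) / np.linalg.norm(full_step)

print(f"diagonal mass ratio: {diag_ratio:.3f}")
print(f"relative gap between A^-1 g and diag(A)^-1 g: {rel_gap:.3f}")
```

The reviewer's suggestion is precisely to relate this ratio to the size of that gap, since a large diagonal mass by itself does not pin down how close the two steps are.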
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the paper tackles an important concern when it comes to pacbayes bounds ie the fact that such bounds apply to the risk of a stochastic predictor rather than a deterministic one the authors propose bounds on a single draw of a predictor according to the pacbayesian posterior however i would like the authors to discuss to what extent the derivation of this single draw bound differs from the result of hellstrm durisi 2020 they refer to indeed the latter is a slow rate o1sqrtn bound while the proposed result is a fast rate o1n one however such bounds already exist in the pacbayes literature for randomized predictors as a matter of fact the pacbayes theorem of seeger 2002 involving the kl divergence between two bernoulli distributions is tighter than both the slow rate result of eq 1 and the fast rate result of eq 2 the seegers bound achieving fast rate when the empirical loss is zero see letarte et al 2019 thm 3 for an explicit connection between eq 2 and seegers bound in other words i wonder if the proposed fast rate bound is a particular case of a general analysis obtainable from a slight generalization of hellstrm durisi 2020 the experiments show promising results but would benefit from being extended figure 1 shows the bound values according to the training epoch up to 30 epochs the training and test loss are still decreasing at this point in order to have the complete picture it would be important to compare the results for fully trained models moreover i would like to see the bound values in the overfitting regime when the training error is close to zero as the experiments are performed in dziugate roy 2017 note that the proposed fast rate bounds only converge in this setting finally a nice addition would be to comment on the possibility to learn a predictor by directly minimizing a single draw bound as it is frequently done in the pacbayes works eg dziugate roy 2017 minor comments i do not understand the subscript i si n of tilde z middle of page 3 it is strange that the authors cite an unpublished paper guedj and pujol 2019 for the canonical pacbayes bound stated in the early pacbayes works of mcallester references letarte et al dichotomize and generalize pacbayesian binary activated deep neural networks neurips 2019 seeger pacbayesian generalisation error bounds for gaussian process classification jmlr 2002 docsepthis paper derives bounds on the test loss under the random subset setting which are expressed with conditional information measures and have fast rates with respect to the sample size n the derived bounds are compared with the practical performance of deep neural networks dnns trained on the mnist and fashionmnist data sets although the derived bounds can be somewhat loose practically as they can be larger than those of slow rates the paper provides steady theoretical progress on the evaluation of loss bounds using conditional information measures although the experiments with nns and practical data sets are interesting they are a little unsatisfactory for demonstrating the significance of the derived fastrate bounds p5 main theorem it would be nicer to explain what novel important proof techniques are required if any to derive the fastrate bounds in section 3 p6 computation of mu2 since nns have hierarchical structures computing the average of their weights does not seem so good i wonder if the computed mu2 are close to zero and if there is any
other possible ways to compute mu2 p7 experiments on mnists although the experiments demonstrate that the fastrate bound can in fact be tighter than the slowrate bound in practical cases i wonder if some experiments should be designed with changing the sample size n in order to compare fast and slow rate bounds as the training and test errors reported in fig 1 are close to each other isnt it better to produce any overfitting situation even for synthetic data docsepthis paper extends results of prior work by steinke and zakynthinou by providing generalization bounds in the pacbayesian and singledraw settings that depend on the conditional mutual information the emphasis in this work is on obtaining fast rates 1n vs 1sqrtn the authors also conduct empirical experiments showing how the fast rate bounds they propose can be useful for obtaining nonvacuous generalization bounds in the context of overparameterized neural networks i think theorem 1 and its corollaries are a nice contribution the paper is very well written and clear the authors do an excellent job in explaining the relevant related work and how their results expand upon the results of earlier work it might be useful if the authors could indicate or explain whether their bounds suggest or motivate improved learning algorithms for example how should one choose a prior distribution q over the hypothesis based on knowledge of the full training examples ### Summary:
the paper extends results from the recent work of steinke and zakynthinou sz for the test loss of randomized learning algorithms they provide bounds in the single draw as well as pacbayes setting the main result is about fast rates the proof of which follows with minor modifications from the corresponding result in sz it is unclear to me whether the contribution over existing work is sufficient to merit acceptance
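The slow-rate, fast-rate and kl-form bounds that the first review above contrasts have, schematically, the following standard shapes. This is a generic sketch of the usual pacbayes templates (with unspecified constants c1, c2 and confidence level delta), not the exact statements of eq 1, eq 2 or the theorems in the paper under review.

```latex
\begin{align*}
\text{slow rate:}\quad
  & L(\rho) \le \hat{L}(\rho)
    + \sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln(n/\delta)}{2n}} \\
\text{fast rate:}\quad
  & L(\rho) \le c_1\,\hat{L}(\rho)
    + c_2\,\frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln(1/\delta)}{n} \\
\text{Seeger's kl form:}\quad
  & \mathrm{kl}\bigl(\hat{L}(\rho)\,\|\,L(\rho)\bigr)
    \le \frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln(2\sqrt{n}/\delta)}{n}
\end{align*}
```

When the empirical risk is close to zero, inverting the kl form already gives an order 1/n rate, which is the sense in which Seeger's bound subsumes the fast-rate behaviour mentioned in the review.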
Below is a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: the paper studies the robustness of machine learning models this paper proposed to use targeted adversarial attacks to increase the robustness of the model specifically the approach uses the semantic difference between the prediction and the true label as a measure to quantify the concept of mistake severity this paper is very wellwritten and easy to understand the experiment results demonstrated the effectiveness of the proposed approach the topic of the paper is aligned with the theme of the workshop my major concern is the use of semantic similarity to quantify mistake severity i think mistake severity should depend on downstream tasks ie how the downstream components change their behaviors in response to the prediction errors instead of the similarities between objects from a semantic perspective for example two semantically different objects could induce similar robot behaviors and two semantically similar objects could also induce very different robot behaviors using semantic similarity can not fully capture the mistake severity and the design of a semantically similar set may introduce additional human bias in the downstream tasks in addition discussion related to training complexity/time could be includeddocsepthis paper proposes a robust training method that fortifies the models prediction accuracies to the adversarially corrupted inputs the core idea is to train the model with noisy inputs whose noises maximize the classification error to the given target labels strong point the proposed method is conceptually straightforward it can be used as an offtheshelf technique for dealing with adversarial attacks weak point the method is too generic it is challenging to find the novelty of the proposed method im quite sure we can find similar approaches having more related research can enhance the credibility of the authors claim otherwise weakening the claim in l57 we propose a novel method of increasing alignment of semantically similar classes based on targeted adversarial training can be more appropriate weak empirical evidence as shown in figure 34 the standard model performs better than the proposed method up to a certain degree of noise or corruptiondocsepthis paper extended the ordinary adversarial training recipe to the targeted attackoriented version to increase the semantic similarity between a models predictions and actual labels of misclassified instances although the approach seems simple the idea and the use case are interesting my specific comments are listed below 1 strictly speaking the targeted version of adversarial training is no longer the minmax optimization problem this is because the attack objective is not the opposite of the training objective if the target label is present in contrast this becomes a bilevel optimizationbased adversarial training formulation see the related work at revisiting and advancing fast adversarial training through the lens of bilevel optimization https://arxiv.org/pdf/2112.12376.pdf please clarify this 2 please add more details for the experiment setup eg the training details and evaluation metrics 3 feel free to add more examples in figure 1 ### Summary:
the paper proposed to use the semantic difference between the predictions and labels to quantify the severity of mistakes which is used to generate targeted adversarial attacks to improve the robustness of the model the idea is interesting in general and the proposed method is straightforward and easy to use please try to address
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 2175, 253, 31640, 273, 5145, 4715, 3210, 436, 2929, 4081, 281, 897, 10522, 48960, 8104, 281, 2572, 253, 31640, 273, 253, 1566, 5742, 253, 2746, 4648, 253, 24705, 3064, 875, 253, 10554, 285, 253, 2032, 5203, 347, 247, 2557, 281, 22048, 253, 4473, 273, 10551, 12147, 50276, 2520, 2929, 310, 1077, 973, 15720, 285, 3477, 281, 2096, 253, 3368, 1543, 5183, 253, 12510, 273, 253, 4081, 2746, 253, 9400, 273, 253, 2929, 310, 15616, 342, 253, 10014, 273, 253, 22586, 50276, 2577, 2201, 4468, 310, 253, 897, 273, 24705, 14259, 281, 22048, 10551, 12147, 50276, 74, 1158, 10551, 12147, 943, 3469, 327, 15450, 8892, 26332, 849, 253, 15450, 4295, 1818, 616, 13576, 275, 2380, 281, 253, 10554, 6332, 3185, 273, 253, 22620, 875, 5113, 432, 247, 24705, 8668, 323, 1650, 767, 3300, 39904, 1027, 5113, 812, 10808, 2074, 15688, 13576, 285, 767, 3300, 39904, 2074, 5113, 812, 671, 10808, 1077, 1027, 15688, 13576, 970, 24705, 14259, 476, 417, 4751, 9232, 253, 10551, 12147, 285, 253, 2216, 273, 247, 24705, 2074, 873, 778, 9569, 3081, 1966, 8492, 275, 253, 15450, 8892, 275, 1635, 5955, 2905, 281, 3733, 5177, 414, 2606, 812, 320, 2908, 7152, 33032, 2520, 2929, 29328, 247, 10237, 3733, 1332, 326, 7574, 7790, 253, 3210, 10554, 3933, 19103, 281, 253, 18539, 274, 1365, 40634, 14800, 253, 5161, 2934, 310, 281, 6194, 253, 1566, 342, 27620, 14800, 3692, 33737, 22950, 253, 9162, 2228, 281, 253, 1677, 2303, 13301, 50276, 9072, 1127, 50276, 783, 4081, 1332, 310, 4473, 1230, 15246, 50276, 262, 476, 320, 908, 347, 271, 273, 649, 1041, 48164, 1732, 280, 323, 10620, 342, 48960, 8104, 50276, 20881, 1127, 50276, 783, 1332, 310, 1512, 12314, 352, 310, 11132, 281, 1089, 253, 38135, 273, 253, 4081, 1332, 516, 3240, 2119, 359, 476, 1089, 2074, 7274, 1907, 625, 2905, 2561, 476, 7278, 253, 17938, 273, 253, 4477, 1750, 5010, 49060, 253, 1750, 275, 298, 3011, 359, 12661, 247, 4460, 1332, 273, 3629, 12420, 273, 3300, 39904, 2074, 5971, 1754, 327, 10522, 48960, 3733, 476, 320, 625, 4569, 50276, 20881, 16774, 1941, 347, 2011, 275, 4677, 5910, 253, 2629, 1566, 17923, 1805, 685, 253, 4081, 1332, 598, 281, 247, 2176, 4248, 273, 6046, 390, 16933, 7152, 33032, 2520, 2929, 6508, 253, 9826, 48960, 3733, 13612, 281, 253, 10522, 2983, 21085, 2715, 281, 2572, 253, 24705, 14259, 875, 247, 3210, 13650, 285, 4588, 13301, 273, 3731, 39651, 10872, 3738, 253, 2746, 3133, 2969, 253, 2934, 285, 253, 897, 1083, 403, 4722, 619, 2173, 5701, 403, 7117, 2708, 50275, 18, 13714, 8288, 253, 10522, 2715, 273, 48960, 3733, 310, 642, 3356, 253, 1054, 4090, 13757, 1895, 436, 310, 984, 253, 2983, 8103, 310, 417, 253, 7285, 273, 253, 3733, 8103, 604, 253, 2303, 5203, 310, 1246, 275, 4499, 436, 4916, 247, 26413, 652, 13757, 3169, 48960, 3733, 15895, 923, 253, 2905, 789, 387, 27694, 2996, 285, 26441, 3809, 48960, 3733, 949, 253, 9655, 273, 26413, 652, 13757, 5987, 39962, 2061, 9275, 1797, 805, 805, 25848, 9275, 50276, 32897, 19148, 436, 50275, 19, 4496, 823, 625, 4278, 323, 253, 3368, 9978, 24088, 253, 3733, 4278, 285, 7103, 17082, 50276, 20, 1928, 1959, 281, 823, 625, 6667, 275, 4677, 337, 50275, 187, 187, 4118, 18435, 27, 783, 2929, 4081, 281, 897, 253, 24705, 3064, 875, 253, 13650, 285, 13301, 281, 22048, 253, 12147, 273, 16503, 534, 310, 908, 281, 6635, 10522, 48960, 8104, 281, 3157, 253, 31640, 273, 253, 1566, 253, 2934, 310, 4722, 275, 2087, 285, 253, 4081, 10258, 375, 310, 15246, 285, 3477, 281, 897, 4496, 1611, 281, 2953, 
253, 30628, 7350, 3340, 253, 4394, 432, 37317, 305, 8289, 91, 275, 253, 2457, 2715, 273, 253, 2929, 209 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 2175, 253, 31640, 273, 5145, 4715, 3210, 436, 2929, 4081, 281, 897, 10522, 48960, 8104, 281, 2572, 253, 31640, 273, 253, 1566, 5742, 253, 2746, 4648, 253, 24705, 3064, 875, 253, 10554, 285, 253, 2032, 5203, 347, 247, 2557, 281, 22048, 253, 4473, 273, 10551, 12147, 50276, 2520, 2929, 310, 1077, 973, 15720, 285, 3477, 281, 2096, 253, 3368, 1543, 5183, 253, 12510, 273, 253, 4081, 2746, 253, 9400, 273, 253, 2929, 310, 15616, 342, 253, 10014, 273, 253, 22586, 50276, 2577, 2201, 4468, 310, 253, 897, 273, 24705, 14259, 281, 22048, 10551, 12147, 50276, 74, 1158, 10551, 12147, 943, 3469, 327, 15450, 8892, 26332, 849, 253, 15450, 4295, 1818, 616, 13576, 275, 2380, 281, 253, 10554, 6332, 3185, 273, 253, 22620, 875, 5113, 432, 247, 24705, 8668, 323, 1650, 767, 3300, 39904, 1027, 5113, 812, 10808, 2074, 15688, 13576, 285, 767, 3300, 39904, 2074, 5113, 812, 671, 10808, 1077, 1027, 15688, 13576, 970, 24705, 14259, 476, 417, 4751, 9232, 253, 10551, 12147, 285, 253, 2216, 273, 247, 24705, 2074, 873, 778, 9569, 3081, 1966, 8492, 275, 253, 15450, 8892, 275, 1635, 5955, 2905, 281, 3733, 5177, 414, 2606, 812, 320, 2908, 7152, 33032, 2520, 2929, 29328, 247, 10237, 3733, 1332, 326, 7574, 7790, 253, 3210, 10554, 3933, 19103, 281, 253, 18539, 274, 1365, 40634, 14800, 253, 5161, 2934, 310, 281, 6194, 253, 1566, 342, 27620, 14800, 3692, 33737, 22950, 253, 9162, 2228, 281, 253, 1677, 2303, 13301, 50276, 9072, 1127, 50276, 783, 4081, 1332, 310, 4473, 1230, 15246, 50276, 262, 476, 320, 908, 347, 271, 273, 649, 1041, 48164, 1732, 280, 323, 10620, 342, 48960, 8104, 50276, 20881, 1127, 50276, 783, 1332, 310, 1512, 12314, 352, 310, 11132, 281, 1089, 253, 38135, 273, 253, 4081, 1332, 516, 3240, 2119, 359, 476, 1089, 2074, 7274, 1907, 625, 2905, 2561, 476, 7278, 253, 17938, 273, 253, 4477, 1750, 5010, 49060, 253, 1750, 275, 298, 3011, 359, 12661, 247, 4460, 1332, 273, 3629, 12420, 273, 3300, 39904, 2074, 5971, 1754, 327, 10522, 48960, 3733, 476, 320, 625, 4569, 50276, 20881, 16774, 1941, 347, 2011, 275, 4677, 5910, 253, 2629, 1566, 17923, 1805, 685, 253, 4081, 1332, 598, 281, 247, 2176, 4248, 273, 6046, 390, 16933, 7152, 33032, 2520, 2929, 6508, 253, 9826, 48960, 3733, 13612, 281, 253, 10522, 2983, 21085, 2715, 281, 2572, 253, 24705, 14259, 875, 247, 3210, 13650, 285, 4588, 13301, 273, 3731, 39651, 10872, 3738, 253, 2746, 3133, 2969, 253, 2934, 285, 253, 897, 1083, 403, 4722, 619, 2173, 5701, 403, 7117, 2708, 50275, 18, 13714, 8288, 253, 10522, 2715, 273, 48960, 3733, 310, 642, 3356, 253, 1054, 4090, 13757, 1895, 436, 310, 984, 253, 2983, 8103, 310, 417, 253, 7285, 273, 253, 3733, 8103, 604, 253, 2303, 5203, 310, 1246, 275, 4499, 436, 4916, 247, 26413, 652, 13757, 3169, 48960, 3733, 15895, 923, 253, 2905, 789, 387, 27694, 2996, 285, 26441, 3809, 48960, 3733, 949, 253, 9655, 273, 26413, 652, 13757, 5987, 39962, 2061, 9275, 1797, 805, 805, 25848, 9275, 50276, 32897, 19148, 436, 50275, 19, 4496, 823, 625, 4278, 323, 253, 3368, 9978, 24088, 253, 3733, 4278, 285, 7103, 17082, 50276, 20, 1928, 1959, 281, 823, 625, 6667, 275, 4677, 337, 50275, 187, 187, 4118, 18435, 27, 783, 2929, 4081, 281, 897, 253, 24705, 3064, 875, 253, 13650, 285, 13301, 281, 22048, 253, 12147, 273, 16503, 534, 310, 908, 281, 6635, 10522, 48960, 8104, 281, 3157, 253, 31640, 273, 253, 1566, 253, 2934, 310, 4722, 275, 2087, 285, 253, 4081, 10258, 375, 310, 15246, 285, 3477, 281, 897, 4496, 1611, 281, 2953, 
253, 30628, 7350, 3340, 253, 4394, 432, 37317, 305, 8289, 91, 275, 253, 2457, 2715, 273, 253, 2929, 209 ]
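The row above describes targeted adversarial training: perturb each input toward a chosen, semantically similar target label, then train the model on the perturbed input with its true label. The sketch below is only an assumed illustration of that recipe in generic PyTorch; the PGD-style attack, the function names, and the hyperparameters are placeholders, not the reviewed paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def targeted_pgd(model, x, y_target, eps=8/255, alpha=2/255, steps=10):
    """Craft a perturbation that pushes predictions toward y_target
    (e.g. a semantically similar class) inside an L-inf ball of radius eps."""
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y_target)
        grad = torch.autograd.grad(loss, x_adv)[0]
        # gradient *descent* on the targeted loss moves the prediction toward y_target
        x_adv = x_adv.detach() - alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0.0, 1.0)
    return x_adv.detach()

def train_step(model, optimizer, x, y_true, y_target):
    """One update on targeted adversarial examples. The inner (attack) objective
    is not the negation of the outer (training) loss, which is the point the last
    review above makes about this being bilevel rather than plain min-max."""
    x_adv = targeted_pgd(model, x, y_target)
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x_adv), y_true)  # still supervised with the true label
    loss.backward()
    optimizer.step()
    return loss.item()
```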
Below is a review of a research paper from a conference or journal. Please write a summary of the review. ### Review: summary the authors tackle the problem of zeroshot learning that is the recognition of classes and categories for which no visual data are available but only semantic embedding providing a description of the classes in terms of auxiliary textual descriptions to this aim authors propose a method dubbed imageguided semantic classification in which a twostream network fed with visual and semantic embeddings learns a compatibility function whose recognition performance is enhanced by means of calibrated stacking chao et al 2016 pros the method is simple and easy to understand cons the computational pipeline is not novel and largely inspired by methods such as the ones proposed in yang et al a unified perspective on multidomain and multitask learning iclr 2015 or liu et al generalized zeroshot learning with deep calibration network neurips 2018 which are not even cited in the paper unfortunately the authors seem to essentially add a calibration module to this kind of architecture since the calibration module is inherited from prior works chao et al 2016 i found the method quite incremental within the experimental comparison ganbased methods are not reported although being a mainstream class of stateoftheart methods i am referring to works such as clswgan xian et al features generating networks for zeroshot learning cvpr 2018 fvaegand2 xian et al a feature generating framework for anyshot learning cvpr 2019 cadavae schonfeld et al generalized zero and fewshot learning via aligned variational autoencoders cvpr 2019 dlfzrl tong et al hierarchical disentanglement of discriminative latent features for zeroshot learning cvpr 2019 or tfvaegan narayan et al latent embedding feedback and discriminative features for zeroshot classification eccv 2020 authors justified this approach by reducing the methods in comparison within the ones which use unseen semantic embeddings only at test time however the knowledge of semantic embeddings also for the unseen classes is something which is necessary in a zeroshot recognition paradigm the authors themselves take advantage of them during the nearest neighbor search thus at this point using the semantic embeddings for the unseen classes is therefore legit and i do not see any added value in constraining the experiments the reported performance is highly suboptimal with respect to the stateoftheart invertible zeroshot recognition flows recently proposed at eccv 2020 greatly outperformed the proposed approach by margin h=49.8 on apy h=54.8 on sun h=59.4 on cub and h=68.0 on awa2 prerebuttal evaluation i regret to register a substantial overlap with prior methods thus undermining the novelty impact of the present submission on the experimental side i found a sharply gapped performance which is highly inferior with respect to the stateoftheart caused by reducing the approaches included in the comparison on the basis of which methods exploit unseen semantic embeddings for training this claim is not convincing in my opinion unseen semantic embeddings are used in any case why not exploiting them for feature generation purposes for those reasons i am afraid i have to discourage the acceptance of the manuscript postrebuttal evaluation final i would like to thank the authors for their response and for the updated version of the manuscript i appreciate their efforts unfortunately i still believe that the paper is lacking in original contribution and i am not fully convinced by
the authors comments on the relationship with yang et al a unified perspective on multidomain and multitask learning iclr 2015 and liu et al generalized zeroshot learning with deep calibration network neurips 2018 which i still judge highly overlapping with the current methodology furthermore although authors clarified on the avoidance of using semantic embeddings for the training methodology i do not see a sharp point in pursuing this approach given the high gap in performance with prior art ganbased for all these reasons i regret to confirm my initial rejection score docsepthis paper proposes a visualsemantic embedding model useful for generalized zeroshot learning the proposed model transforms an image into a label classifier which is then used to predict the correct label in the semantic space the paper is well constructed and easy to read it provides a good presentation of some related work and identifies the contributions as compared to existing approaches the experimental validation is performed on four popular public datasets and compares the performance to several state of the art approaches the obtained performance shows similarpromising results as compared to the state of the art from my perspective the paper is missing some experimental analysiscomparison to some recent methods that are inductive only to samples and also some methods that are transductive for unseen class prototypes and unlabeled unseen test instances for instance papers mentioned in section 43 first that comparison will allow evaluating the performance of the proposed approach to more recent papers than the ones used in section 42 second it seems that these methods specifically the ones that are transductive for unseen class prototypes achieve a much higher performance and its important to evaluate the performance of the proposed method in that setting or to report on the performance loss when someone decides to use this approach in that specific setting inductive to both unseen images and unseen semantic vectors a discussion that addresses the above questionsconcerns could do it too postrebuttal evaluation final i would like to thank the authors for their response and for updating their paper based on the reviewers feedback following these updates im changing my recommendation to rejection for the following 2 reasons 1 the technical novelty of this paper is moderate and theres a significant gap in the model performance as compared to state of the art 2 the authors failed to provide convincing answers to many of the reviewers concerns including motivation for not using semantic embeddings during the training process and not comparing their approach to transductive zsl ones which achieve a higher performancedocsepthis paper describes zeroshot learning for image classification the proposed method is termed as imageguided semantic classification igsc that learns imagespecific label classifiers to achieve zeroshot classification experimental results on four standard datasets are reported the authors are suggested to address the following comments 1 the idea to learn imagespecific classifiers specified by the yielded parameters resembles the conventional local learning where a classifier is learned for each training sample in the last paragraph of page 4 the authors mention that the proposed igsc mechanism is similar to that of dynamic filter networks by jia et al but also point out a fundamental difference that their method instead aims to learn model representations however similar techniques in this aspect have 
already been explored eg in solving the vqa problem take for example the cvpr 2016 paper entitled image question answering using convolutional neural network with dynamic parameter prediction the novelty of the proposed imageguided formulation seems to be limited 2 the math notations in the sentence right after equation 1 are incorrect y in yu should be corrected into y yu also y in ys cup yu should be y ys cup yu 3 besides the technical novelty my main concern about this work is that the experimental results are not convincing for example the previous technique aren xie et al 2019 is included in table 4 but not in table 3 however the classification results achieved by aren are considerably better than those yielded by the proposed technique for the zsl results in table 3 aren achieves 606718679392 while those by the proposed method are 583569621352 for the gzsl results in table 4 the accu result of sun by aren is 19 not 9 furthermore as calibrated stacking cs is also used in aren it is not reasonable that the comparison excludes the gzsl results of arencs which are significantly better than those by the proposed igsccs in particular the respective harmonic means derived by arencs are 359660647369 while those by the proposed technique igscac are 349487393332 4 the comparisons skip more recent zsl techniques in that the proposed igsc formulation does not consider semantic vectors of unseen classes however the semantic vectors of both seen and unseen classes are often made available in the training stage of zsl and most recent zsl techniques do consider exploring such information it would be useful if the proposed igsc can be generalized to take account of all the available semantic information so that comprehensive comparisons to sota zsl techniques can be carried out after author response my main concerns about this paper are technical novelty and weak experimental results the authors did not make efforts to address the two aspects properly for example i have listed several issues in my comment 3 about the specific experimental results but the authors did not try to address them at all their responses to reviewer 4 about the transductive zsl are not relevant to the weak experimental results of concern as most of my concerns are not resolved in the response i see no evidence to upgrade my rating i appreciate that the authors have revised their responses however the added response still does not address the two specific questions the reasons why aren is not included in table 3 and arencs not in table 4 in my comment 3 my final evaluation about the paper is on the negative side in that its technical novelty is moderate and the experimental results are not convincing docsep summary this paper proposes a simple yet effective method for zeroshot learning in the method a network is learned to predict the compatibility function weight given the input of the image the predicted weight is then applied to semantic attributes and the final class label is predicted by the maximum compatibility score the method is evaluated on benchmark datasets and illustrates competitive performance reason for score the paper is overall clear and easy to understand and the proposed method is simple and achieves competitive results but the paper lacks significance and contributions detailed below which makes me hard to judge its potential impact in the related area so i dont find the paper meet the standard of iclr hopefully the authors can address my concerns in the rebuttal period pros 1 the authors propose a simple 
method for zeroshot learning motivated by learning instanceguided classifier weights 2 the method demonstrates competitive results on benchmark datasets with a very simple implementation 3 the paper is written clearly and easy to follow cons 1 the paper fails to validate its motivation the authors claim the method is proposed to learn an imageguided semantic classification model and the weight values depend on the input image although the model architecture complies the motivation and the evaluation results are good it is hard to tell if the results are achieved by the imageguided weight values the authors didnt provide either theoretical analysis or empirical evaluations to demonstrate that the weights learned by the model is imagesensitive and the overall performance benefits from it for example it will be helpful if the authors can visualize the 2 some implementation details are missing and some choices in the method lack explanations for example why the model is trained by a crossentropy loss in 8 is it specifically beneficial to the proposed method in the zeroshot learning context for the nonlinear label classifier what is the hidden layer dimension h used in the experiments what is the effect of extending the nonlinear label classifier to deeper networks will it saturate etc the paper lacks this type of details or evaluations which makes the readers hard to evaluate its potential effectiveness 3 the proposed method is a framework which can implemented in various ways but the authors fail to provide enough analysis on its potential extensions for example in the experiments for fair comparison the authors froze the feature extractor but it will be helpful if the authors can also provide the performance with the feature extractor finetuned so as to provide insight on how this framework will synergize with deep feature networks in addition id love to see the performance of the proposed method on largescale recognitioninthewild dataset like imagenet 4 although the authors fix the feature extractor and only compare with the methods in the exact same setting namely feature extractor fixed no unseen information in training the baselines compared are rather weak for example the baseline models may not benefit from the calibrated stacking cs and the proposed method without cs has less significant results some baseline models like spaen have lower unseen and higher seen than the proposed method so it is possible they can achieve better performance with cs also like we discussed in 3 many featurebased methods achieve significantly better results and it will be great to show how the proposed method works if the entire model including the feature extractor is trained endtoend based on the points above i find the paper less significant and lack contributions thus giving my score ### Summary:
this paper presents work on zeroshot learning the reviewers appreciated the simplicity of the method and its clear exposition however concerns were raised over novelty motivation and empirical validation after reading the authors response the reviewers remained of the opinion that these concerns have not yet been addressed sufficiently based on these points the paper is not yet ready for publication
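The reviews in this row repeatedly describe the same pipeline: a network maps an image to the weights of a label classifier, those weights score the semantic (attribute) vector of every class, the maximum compatibility score gives the prediction, and calibrated stacking subtracts a constant from seen-class scores at test time. The sketch below is a minimal reconstruction of that description, assuming a PyTorch setup; the layer sizes, names, and the calibration constant gamma are illustrative assumptions rather than the authors' code.

```python
import torch
import torch.nn as nn

class IGSC(nn.Module):
    """Image-guided semantic classification as described in the reviews:
    predict an image-specific classifier, then score each class's attributes."""
    def __init__(self, feat_dim, attr_dim, hidden=512):
        super().__init__()
        self.weight_net = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, attr_dim),
        )

    def forward(self, img_feat, class_attrs):
        # img_feat: (B, feat_dim) from a frozen backbone
        # class_attrs: (C, attr_dim) semantic vectors for seen + unseen classes
        w = self.weight_net(img_feat)      # (B, attr_dim) image-specific classifier weights
        return w @ class_attrs.t()         # (B, C) compatibility scores

def predict_gzsl(model, img_feat, class_attrs, seen_mask, gamma=0.7):
    """Calibrated stacking (chao et al 2016): lower the scores of seen classes
    by a constant before the argmax so that unseen classes can be selected."""
    scores = model(img_feat, class_attrs)
    scores = scores - gamma * seen_mask.float()   # seen_mask: (C,) with 1 for seen classes
    return scores.argmax(dim=1)

# training sketch: cross-entropy over the seen classes only, e.g.
#   loss = torch.nn.functional.cross_entropy(model(img_feat, seen_attrs), labels)
```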
[ 50275, 783, 2361, 3045, 310, 4122, 749, 29776, 342, 1675, 281, 253, 1375, 23037, 14387, 42275, 1182, 254, 6934, 302, 8981, 14221, 4102, 4081, 387, 23746, 87, 9169, 10260, 41731, 10574, 253, 4081, 2746, 407, 8459, 288, 31358, 327, 1049, 90, 288, 47070, 327, 5101, 288, 37308, 327, 12966, 285, 288, 40100, 327, 33326, 19, 50274, 3456, 250, 2858, 22559, 7103, 891, 14938, 281, 8749, 247, 6832, 14787, 342, 2720, 3082, 3021, 762, 39590, 253, 38135, 3486, 273, 253, 1246, 19529, 327, 253, 5661, 1930, 891, 1119, 247, 23071, 305, 6965, 3045, 534, 310, 4122, 18134, 342, 1675, 281, 253, 1375, 23037, 14387, 4269, 407, 8493, 253, 7274, 2908, 275, 253, 5301, 327, 253, 3720, 273, 534, 3082, 22059, 39709, 24705, 46234, 323, 3733, 436, 1750, 310, 417, 21414, 275, 619, 4743, 39709, 24705, 46234, 403, 908, 275, 667, 2219, 2139, 417, 38883, 731, 323, 4735, 5978, 6378, 323, 1110, 4606, 891, 717, 9202, 281, 43162, 253, 14924, 273, 253, 7714, 50276, 5996, 250, 2858, 22559, 7103, 2457, 891, 651, 751, 281, 5717, 253, 4477, 323, 616, 2380, 285, 323, 253, 9300, 2715, 273, 253, 7714, 891, 11435, 616, 6031, 19235, 891, 1335, 2868, 326, 253, 2929, 14999, 670, 3236, 7680, 285, 891, 717, 417, 4751, 13762, 407, 253, 4477, 5701, 327, 253, 2954, 342, 30966, 1162, 355, 247, 27998, 8668, 327, 23964, 297, 404, 285, 1554, 262, 1945, 4715, 17857, 32888, 4104, 285, 632, 86, 1162, 355, 14923, 1182, 254, 6934, 302, 4715, 342, 3676, 18543, 2990, 5723, 2824, 4765, 534, 891, 1335, 5963, 4122, 21481, 342, 253, 1655, 16182, 33810, 3738, 4477, 31637, 327, 253, 28772, 273, 970, 24705, 46234, 323, 253, 3733, 16182, 891, 513, 417, 923, 247, 9479, 1127, 275, 23453, 436, 2746, 1677, 253, 1029, 8037, 275, 3045, 342, 2720, 1445, 36827, 3169, 50276, 1542, 512, 841, 4606, 891, 14938, 281, 6583, 619, 3302, 18235, 4868, 50276, 7152, 33032, 2520, 2929, 29328, 247, 5304, 6017, 6484, 21496, 1566, 4217, 323, 14923, 1182, 254, 6934, 302, 4715, 253, 4081, 1566, 29698, 271, 2460, 715, 247, 5203, 30410, 534, 310, 840, 908, 281, 3283, 253, 3451, 5203, 275, 253, 24705, 2317, 50275, 783, 2929, 310, 973, 8818, 285, 3477, 281, 1239, 352, 3400, 247, 1175, 9759, 273, 690, 2905, 789, 285, 22649, 253, 9021, 347, 2429, 281, 5368, 7274, 50276, 783, 5661, 12820, 310, 2684, 327, 1740, 4633, 1345, 15302, 285, 26662, 253, 3045, 281, 2067, 1375, 273, 253, 1445, 7274, 253, 2797, 3045, 2722, 2074, 13382, 2182, 1543, 347, 2429, 281, 253, 1375, 273, 253, 1445, 50276, 4064, 619, 8668, 253, 2929, 310, 5816, 690, 5661, 1783, 47109, 281, 690, 3332, 3082, 326, 403, 42115, 760, 281, 3530, 285, 671, 690, 3082, 326, 403, 811, 43324, 323, 39709, 966, 3861, 9117, 285, 440, 22027, 39709, 1071, 10872, 50276, 1542, 4227, 9380, 5393, 275, 2593, 7652, 806, 326, 5301, 588, 1581, 16344, 253, 3045, 273, 253, 4081, 2746, 281, 625, 3332, 9380, 685, 253, 4394, 908, 275, 2593, 5976, 1273, 352, 3133, 326, 841, 3082, 5742, 253, 4394, 326, 403, 811, 43324, 323, 39709, 966, 3861, 9117, 5115, 247, 1199, 2169, 3045, 285, 697, 1774, 281, 7472, 253, 3045, 273, 253, 4081, 1332, 275, 326, 4758, 390, 281, 1304, 327, 253, 3045, 2957, 672, 3095, 21936, 281, 897, 436, 2746, 275, 326, 2173, 4758, 42115, 281, 1097, 39709, 3888, 285, 39709, 24705, 11390, 50276, 66, 5955, 326, 12453, 253, 1840, 3533, 585, 1209, 2224, 812, 513, 352, 1512, 50276, 5996, 250, 2858, 22559, 7103, 2457, 891, 651, 751, 281, 5717, 253, 4477, 323, 616, 2380, 285, 323, 22753, 616, 2929, 1754, 327, 253, 30628, 8680, 1563, 841, 11269, 516, 6890, 619, 17401, 281, 18235, 323, 253, 1563, 374, 4606, 337, 253, 7681, 38135, 273, 436, 2929, 310, 10290, 
285, 253, 373, 247, 1534, 8037, 275, 253, 1566, 3045, 347, 2429, 281, 1375, 273, 253, 1445, 374, 253, 4477, 4242, 281, 2085, 21414, 9172, 281, 1142, 273, 253, 30628, 7350, 1690, 16038, 323, 417, 970, 24705, 46234, 1309, 253, 3733, 1232, 285, 417, 10941, 616, 2746, 281, 50276, 3675, 43324, 1182, 3433, 4394, 534, 5115, 247, 2169, 1347, 3086, 406, 33032, 2520, 2929, 8631, 1182, 254, 6934, 302, 4715, 323, 2460, 9162, 50276, 783, 4081, 1332, 310, 23776, 347, 2460, 26960, 24705, 9162, 25477, 1026, 326, 33772, 3888, 29765, 5203, 49996, 281, 5115, 1182, 254, 6934, 302, 9162, 5661, 1543, 327, 1740, 2629, 15302, 403, 2361, 50276, 783, 4477, 403, 5125, 281, 2953, 253, 1563, 5701, 337, 253, 2934, 281, 3037, 3888, 29765, 49996, 7616, 407, 253, 20714, 3602, 29217, 253, 6041, 1980, 4715, 835, 247, 30410, 310, 6311, 323, 1016, 3733, 3410, 275, 253, 1390, 12494, 273, 3239, 577, 253, 4477, 3748, 326, 253, 4081, 25477, 1026, 5122, 310, 2074, 281, 326, 273, 50276, 19681, 5806, 6928, 407, 480, 571, 1162, 355, 533, 50276, 12563, 1127, 562, 247, 7936, 3064, 326, 616, 1332, 3185, 13698, 281, 3037, 1566, 14237, 2299, 2074, 5609, 275, 436, 4809, 452, 2168, 644, 14859, 24088, 275, 16161, 253, 362, 31569, 1895, 1379, 323, 1650, 253, 30105, 1087, 4022, 2929, 7429, 2460, 1953, 22291, 970, 27311, 267, 11454, 2990, 342, 7870, 4764, 10554, 253, 38135, 273, 253, 4081, 2460, 26960, 15895, 3133, 281, 320, 3710, 50276, 19, 253, 14168, 41818, 275, 253, 6197, 987, 846, 5150, 337, 403, 13583, 340, 275, 340, 86, 943, 320, 15045, 715, 340, 50276, 30838, 671, 340, 275, 340, 84, 5500, 340, 86, 943, 320, 340, 50276, 656, 5500, 340, 86, 50276, 20, 16280, 253, 7681, 38135, 619, 2022, 4468, 670, 436, 789, 310, 326, 253, 5661, 1543, 403, 417, 21414, 323, 1650, 253, 2045, 5853, 6403, 1269, 466, 1162, 355, 6247, 310, 2908, 275, 2829, 577, 533, 417, 275, 2829, 495, 2299, 253, 9162, 1543, 6786, 407, 6403, 403, 15455, 1805, 685, 1110, 20714, 407, 253, 4081, 5853, 50276, 1542, 253, 1182, 3433, 1543, 275, 2829, 495, 6403, 33526, 3925, 2251, 1093, 26233, 29827, 1223, 1110, 407, 253, 4081, 1332, 403, 9135, 1671, 2090, 3763, 1012, 3583, 323, 253, 305, 91, 3433, 1543, 275, 2829, 577, 253, 756, 86, 906, 273, 5101, 407, 6403, 310, 655, 417, 898, 33810, 347, 35890, 37444, 29180, 310, 671, 908, 275, 6403, 352, 310, 417, 5272, 326, 253, 5301, 43337, 253, 305, 91, 3433, 1543, 273, 6403, 6113, 534, 403, 3012, 1805, 685, 1110, 407, 253, 4081, 209, 17638, 550, 84, 275, 1798, 253, 9056, 23007, 2097, 6012, 407, 6403, 6113, 403, 34205, 38874, 25953, 26673, 1223, 1110, 407, 253, 4081, 5853, 25477, 1026, 317, 403, 36130, 30910, 1867, 1610, 1237, 577, 253, 14023, 17049, 625, 3332, 1182, 3433, 5609, 275, 326, 253, 4081, 25477, 1026, 15895, 1057, 417, 1908, 24705, 11390, 273, 39709, 5971, 2299, 253, 24705, 11390, 273, 1097, 2326, 285, 39709, 5971, 403, 2223, 1160, 2130, 275, 253, 3733, 3924, 273, 1182, 3433, 285, 954, 3332, 1182, 3433, 5609, 513, 1908, 18216, 824, 1491, 352, 651, 320, 4217, 604, 253, 4081, 25477, 1026, 476, 320, 14923, 281, 1379, 2395, 273, 512, 253, 2130, 24705, 1491, 594, 326, 11088, 14023, 281, 256, 5503, 1182, 3433, 5609, 476, 320, 4824, 562, 50276, 6438, 2488, 2380, 50276, 2577, 2022, 7350, 670, 436, 2929, 403, 7681, 38135, 285, 5075, 5661, 1543, 253, 4477, 858, 417, 1056, 6031, 281, 2953, 253, 767, 7794, 6283, 323, 1650, 891, 452, 7117, 2067, 3374, 275, 619, 4385, 495, 670, 253, 2173, 5661, 1543, 533, 253, 4477, 858, 417, 1611, 281, 2953, 731, 387, 512, 616, 6128, 281, 37317, 577, 670, 253, 811, 43324, 1182, 3433, 403, 417, 4623, 281, 253, 
5075, 5661, 1543, 273, 4468, 50276, 284, 954, 273, 619, 7350, 403, 417, 11512, 275, 253, 2380, 891, 923, 642, 1941, 281, 15047, 619, 13716, 50276, 74, 11435, 326, 253, 4477, 452, 17265, 616, 6128, 2299, 253, 2879, 2380, 1335, 1057, 417, 2953, 253, 767, 2173, 3533, 253, 4606, 2139, 6403, 310, 417, 2908, 275, 2829, 495, 285, 6403, 6113, 417, 275, 2829, 577, 275, 619, 4385, 495, 619, 2457, 7103, 670, 253, 2929, 310, 327, 253, 4016, 1930, 275, 326, 697, 7681, 38135, 310, 10290, 285, 253, 5661, 1543, 403, 417, 21414, 5474, 33032, 6010, 436, 2929, 29328, 247, 2969, 2568, 3576, 1332, 323, 1182, 254, 6934, 302, 4715, 275, 253, 1332, 247, 2990, 310, 6311, 281, 3283, 253, 22862, 1159, 2801, 1677, 253, 3280, 273, 253, 2460, 253, 8131, 2801, 310, 840, 3732, 281, 24705, 12474, 285, 253, 2457, 966, 5203, 310, 8131, 407, 253, 4869, 22862, 4868, 253, 1332, 310, 6760, 327, 22791, 15302, 285, 18303, 12085, 3045, 50274, 10752, 323, 4868, 253, 2929, 310, 4583, 2590, 285, 3477, 281, 2096, 285, 253, 4081, 1332, 310, 2969, 285, 33526, 12085, 1543, 533, 253, 2929, 19756, 8453, 285, 9021, 7000, 2708, 534, 2789, 479, 1892, 281, 5963, 697, 2442, 3486, 275, 253, 2905, 2170, 50276, 601, 891, 13414, 1089, 253, 2929, 2525, 253, 2629, 273, 17857, 32888, 18670, 253, 4477, 476, 2953, 619, 7350, 275, 253, 30080, 22559, 2180, 50274, 856, 84, 337, 253, 4477, 12661, 247, 2969, 1332, 323, 1182, 254, 6934, 302, 4715, 17194, 407, 4715, 4227, 26960, 30410, 13461, 374, 253, 1332, 14371, 12085, 1543, 327, 22791, 15302, 342, 247, 1077, 2969, 7092, 495, 253, 2929, 310, 3542, 4518, 285, 3477, 281, 956, 50275, 5040, 337, 253, 2929, 10224, 281, 17813, 697, 16038, 253, 4477, 1750, 253, 1332, 310, 4081, 281, 3037, 271, 2460, 26960, 24705, 9162, 1566, 285, 253, 2801, 2193, 3469, 327, 253, 3280, 2460, 3738, 253, 1566, 10336, 3137, 447, 253, 16038, 285, 253, 7103, 1543, 403, 1175, 352, 310, 1892, 281, 2028, 604, 253, 1543, 403, 6786, 407, 253, 2460, 26960, 2801, 2193, 253, 4477, 42126, 2085, 2057, 10527, 1783, 390, 16774, 27163, 281, 7568, 326, 253, 13461, 6311, 407, 253, 1566, 310, 3888, 18917, 285, 253, 4583, 3045, 5373, 432, 352, 323, 1650, 352, 588, 320, 9371, 604, 253, 4477, 476, 31986, 253, 50276, 19, 690, 7092, 4278, 403, 5816, 285, 690, 10165, 275, 253, 1332, 3480, 22909, 323, 1650, 2139, 253, 1566, 310, 10166, 407, 247, 2831, 290, 10144, 2957, 275, 854, 310, 352, 5742, 12912, 281, 253, 4081, 1332, 275, 253, 1182, 254, 6934, 302, 4715, 3634, 323, 253, 14561, 5203, 30410, 752, 310, 253, 8763, 3828, 7877, 288, 908, 275, 253, 4679, 752, 310, 253, 1055, 273, 13633, 253, 14561, 5203, 30410, 281, 12861, 6928, 588, 352, 19004, 366, 3966, 253, 2929, 19756, 436, 1511, 273, 4278, 390, 27163, 534, 2789, 253, 10668, 1892, 281, 7472, 697, 2442, 12510, 495, 253, 4081, 1332, 310, 247, 7792, 534, 476, 9009, 275, 2710, 4088, 533, 253, 4477, 1891, 281, 2085, 2217, 1783, 327, 697, 2442, 18149, 323, 1650, 275, 253, 4679, 323, 4344, 5301, 253, 4477, 44659, 253, 4735, 4908, 263, 533, 352, 588, 320, 9371, 604, 253, 4477, 476, 671, 2085, 253, 3045, 342, 253, 4735, 4908, 263, 1442, 292, 37437, 594, 347, 281, 2085, 12288, 327, 849, 436, 7792, 588, 26455, 907, 342, 3676, 4735, 6928, 275, 1635, 2654, 2389, 281, 923, 253, 3045, 273, 253, 4081, 1332, 327, 1236, 2510, 25912, 8981, 565, 248, 32778, 10895, 751, 4440, 257, 292, 577, 3738, 253, 4477, 4993, 253, 4735, 4908, 263, 285, 760, 7277, 342, 253, 3082, 275, 253, 3242, 1072, 4758, 10775, 4735, 4908, 263, 4229, 642, 39709, 1491, 275, 3733, 253, 1666, 25379, 2429, 403, 2581, 5075, 323, 1650, 253, 8245, 3210, 778, 
417, 5649, 432, 253, 35890, 37444, 29180, 50276, 395, 253, 4081, 1332, 1293, 29180, 556, 1679, 1534, 1543, 690, 8245, 3210, 751, 39072, 257, 452, 2406, 39709, 285, 2169, 2326, 685, 253, 4081, 1332, 594, 352, 310, 1896, 597, 476, 5115, 1805, 3045, 342, 29180, 671, 751, 359, 5469, 275, 495, 1142, 4735, 3169, 3082, 5115, 3012, 1805, 1543, 285, 352, 588, 320, 1270, 281, 921, 849, 253, 4081, 1332, 2987, 604, 253, 2862, 1566, 1690, 253, 4735, 4908, 263, 310, 10166, 990, 936, 423, 50275, 3169, 327, 253, 2792, 1840, 891, 1089, 253, 2929, 1679, 1534, 285, 3480, 9021, 3021, 4933, 619, 4868, 187, 187, 4118, 18435, 27, 2520, 2929, 10262, 789, 327, 1182, 254, 6934, 302, 4715, 50276, 783, 30628, 14109, 253, 17647, 273, 253, 1332, 285, 697, 2590, 47284, 50276, 35529, 7350, 497, 5439, 689, 38135, 16038, 285, 16774, 12820, 50276, 6438, 4361, 253, 4477, 2380, 253, 30628, 6376, 273, 253, 4743, 326, 841, 7350, 452, 417, 2568, 644, 9713, 10481, 50276, 3169, 327, 841, 2792, 253, 2929, 310, 417, 2568, 4704, 323, 9311 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[Tokenized form of the preceding row — an array of several hundred integer token IDs with no readable content; omitted.]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

Summary: In this paper the authors propose Continual Memory (CM), targeted towards a reasoning scenario called "reasoning after memorization". The main goal of CM is to enable long-term memorization, as opposed to memory networks that suffer from gradual forgetting. They evaluate their model both on synthetic data as well as a few downstream benchmarks.

Overall assessment: I really struggled with this one, and I think there are some interesting ideas in there. However, it was very hard for me to understand the main motivation and story behind the proposed model and its design choices. Moreover, the task itself is not clearly defined until the experiments section, making it really hard to understand the claims and motivations of the work. I will provide detailed feedback below.

Feedback:
1. One thing that can improve the paper substantially is restructuring the introduction to clearly state the motivation, studied task, proposed solution, and the main contributions of the work.
1.1. For example, the authors briefly mention QA/VQA/recommendation in the beginning of the introduction and then do not formally present/discuss their studied task in the introduction. QA/VQA/recommendation are large research areas with many different benchmarks and approaches. Some references to reasoning have also been mentioned in the introduction, but what area in reasoning is this paper specifically studying? It would be very helpful for the reader to understand early on what the target of the paper is.
1.2. Some concepts used in the introduction are not well defined. For example, the authors refer multiple times to "reasoning while experiencing" and "reasoning after memorizing" without formally defining them. I was not familiar with these notions and wasn't able to find any pointers through online search. However, if these are known concepts in a subarea, it would be very helpful if the authors could add a citation to where they were originally defined; if not, it would be helpful to formally define them. Another vague concept is "raw content": it is not clear what it is referring to. Is it the input? Perhaps some source of knowledge? If the task is defined, the authors can use examples to make these concepts more clear.
1.3. The introduction makes connections to human cognition all throughout that read a bit subjective and are stated without any citations, for example paragraph 2 of the introduction.

It is really hard for me to understand the main contributions of the paper and to make a fair assessment until the paper text has been revised. If the authors are willing to submit a modified version during the author response period, I will reread and reevaluate my score.

docsep

This paper proposes the Continual Memory machine, which learns to compress an input stream with a continually-learned memory module for reasoning after long-term memorization. Specifically, they first model an input stream as a sequence of fixed-length segments of individual items. Then they use the Transformer architecture with multi-head attention to learn slot-to-item attention scores for updating the memory, which consists of k memory slots, with input features; the memory updates are controlled by a GRU network. For improving memorization over long-term memories, they also propose self-supervised training inspired by the masked language modeling of BERT. Their target scenarios are long-term text question answering and lifelong sequence recommendation. The paper also conducts experiments on its synthetic dataset.

I believe a key contribution of this paper is that they connect self-supervised learning and memory-augmented neural networks. However, I am not exactly convinced that this helps long-term memorization and reasoning after memorizing; the current experiments cannot justify this as well.

The major weakness of this paper is the experiment design. The main experiments are done on the synthetic dataset, but the construction of the synthetic dataset is not clear to me. For example, how do you create a series of the logic chain here, and can you connect your synthetic data with a real application by giving some illustrative examples? Can you at least show some examples of your synthetic dataset? In this current presentation I cannot judge whether the use of synthetic data is reasonable or not for evaluating the methods. For the realistic datasets, can you also show some case studies, such as the learned reasoning chains and how the memory updates during memorization?

I don't see a clear benefit of the so-called "reasoning after memorizing" versus "reasoning while experiencing". The authors also didn't use any clear mathematical formulation to distinguish between these two settings. The mentioned related works can also save their memories after training and then use them for reasoning and inference only; what are the main disadvantages of doing that? Can you show the differences in math and in experiments?

Some minor points:
- Is rf the number of all facts or the number of all fact types? Or do you use fact and fact type interchangeably here?
- Why are there rq/ra different chains for each query-answer pair? I thought rq is the number of all queries, no?
- Why do you claim that using segments can lead to bidirectional context (Sec. 3.2)?
- Please use "their" instead of "his or her" for making ICLR a more inclusive community.

docsep

To approach reasoning after memorization, the paper presents a Continual Memory (CM) framework using a memory-augmented neural network (MANN) and self-supervised training. In particular, the CM compresses the input sequence into a matrix memory using self-attention mechanisms and a gated recurrent update (GRU). Then, together with a Transformer decoder, the memory is used for downstream reasoning tasks without the need of referring to the original input sequence. Moreover, the framework is simultaneously trained with auxiliary losses to enforce memorization capability. A variety of experiments demonstrate promising results, in which the CM outperforms two MANN baselines and shows competitive performance against state-of-the-art methods.

Pros:
- Reasoning after memorization is an interesting problem, and the proposed solution generally makes sense under this setting.
- The proposed solution combines several techniques, some of which seem novel and useful.
- The experiments are diverse, with good results and SOTA for the recommendation task.

Cons:
- The writing is sometimes misleading and vague.
- There is no major novel contribution.
- The synthetic task is poorly described.
- The experiments lack details of baselines and hyperparameters.

Detailed comments and questions:
- In the introduction, catastrophic forgetting [1] is about continual learning over multiple tasks and thus different from the problem the paper is addressing; please explain the relation here.
- In Sec. 3.2, the memory writing looks overcomplicated; any explanation for the choice of designing it this way?
- As in Eq. 3, the memory slots seem to be updated independently; that is, there is no memory–memory interaction in determining the content of a memory slot for the next timestep. Classical MANNs allow reading from memory during encoding and thus enable using other memory slots to write to a memory slot. The CM seems not to have this property, which may be a disadvantage. Please elaborate more on this point, or correct me if I misunderstand.
- In Sec. 3.3, to construct negative fragments, 50% of unmasked items were replaced — with what?
- Self-supervised training to improve memorization in MANNs is not new; please review other related works [2, 3].
- There is little description of how the decoder uses the memory for inference; please consider using explicit equations to describe the process clearly.
- There is no concrete example of the data used in the synthetic task. The task seems to be more like a memorization benchmark rather than reasoning after memorization. The authors should consider known synthetic tasks that test both memorization and reasoning, such as N-farthest [4], relational associative recall [5], or bAbI [6]. Using known tasks makes comparison with other approaches easier.
- The memory-based baselines are inadequate; please consider stronger baselines that can reason and remember [4, 5]. Also, the related work is not comprehensive without these methods.
- Did the authors tune critical hyperparameters of DNC (e.g., number of heads) or NUTM (e.g., number of cores)? Also, the authors should consider a comparison between the baselines' numbers of hyperparameters.
- The authors claim that segment-level memorization is better. How to choose a good segment size n? If possible, please conduct an ablation study to verify the performance with different n.
- The name of the baseline "two-stream" is misleading; it is unclear how the baseline works.
- Some writing/format problems: add a space after "CM" in the baseline name (e.g., page 6, "CMonly reason" → "CM only reason"); please use \citet when appropriate (e.g., page 2, "le et al 2019a proposes" → "Le et al. (2019a) propose").

I may raise the score if the authors improve the writing clarity and add more content (baselines, synthetic tasks, hyperparameters) to the experiments.

[1] James Kirkpatrick, Razvan Pascanu, Neil Rabinowitz, Joel Veness, Guillaume Desjardins, Andrei A. Rusu, Kieran Milan, John Quan, Tiago Ramalho, Agnieszka Grabska-Barwinska, et al. Overcoming catastrophic forgetting in neural networks. Proceedings of the National Academy of Sciences, 114(13):3521–3526, 2017.
[2] Tsendsuren Munkhdalai, Alessandro Sordoni, Tong Wang, and Adam Trischler. Metalearned neural memory. In Advances in Neural Information Processing Systems, pp. 13331–13342, 2019.
[3] Taewon Park, Inchul Choi, and Minho Lee. Distributed memory based self-supervised differentiable neural computer. arXiv preprint arXiv:2007.10637, 2020.
[4] Adam Santoro, Ryan Faulkner, David Raposo, Jack Rae, Mike Chrzanowski, Theophane Weber, Daan Wierstra, Oriol Vinyals, Razvan Pascanu, and Timothy Lillicrap. Relational recurrent neural networks. In Advances in Neural Information Processing Systems, pp. 7299–7310, 2018.
[5] Hung Le, Truyen Tran, and Svetha Venkatesh. Self-attentive associative memory. In Proceedings of Machine Learning and Systems 2020, pages 2363–2372, 2020.
[6] J. Weston, A. Bordes, S. Chopra, A. M. Rush, B. van Merriënboer, A. Joulin, and T. Mikolov. Towards AI-complete question answering: a set of prerequisite toy tasks. arXiv preprint arXiv:1502.05698, 2015.

### Summary:
This paper proposes an approach to allow a neural network to memorize and reason over a long time horizon. Experiments on synthetic datasets, question answering, and sequence recommendation are presented to evaluate the proposed method. The paper addresses an important problem of processing long sequences. However, all reviewers agree that the writing of the paper can be improved (i.e., motivation, details of experiment design/setup, and others below). Importantly, I think the authors need to elaborate the differences of Continual Memory with existing episodic memory methods. The authors added a paragraph about continual learning during the rebuttal period and mentioned that their Continual Memory focuses on remembering an infinite information stream without forgetting. Episodic memory models can be applied/adapted for this purpose, so the authors should at least compare with one of them, ideally more.
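The second and third reviews in this row describe the same memory-update mechanism: k memory slots attend over a fixed-length segment of item features (slot-to-item multi-head attention), and the result is folded into the memory through a GRU-style gate. As a reading aid only, here is a minimal sketch of what such a slot update could look like; the class name, dimensions, and wiring below are assumptions for illustration and are not taken from the paper under review.

```python
import torch
import torch.nn as nn

class SlotMemory(nn.Module):
    """Toy slot memory: k slots attend over one segment of item features
    (slot-to-item attention) and are then gated with a GRU-style update.
    Hypothetical sketch, not the reviewed paper's implementation."""

    def __init__(self, num_slots: int = 8, dim: int = 64, num_heads: int = 4):
        super().__init__()
        self.init_slots = nn.Parameter(0.02 * torch.randn(num_slots, dim))
        # slots act as queries; the segment's items act as keys and values
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.gate = nn.GRUCell(dim, dim)

    def forward(self, memory: torch.Tensor, segment: torch.Tensor) -> torch.Tensor:
        # memory: (batch, num_slots, dim); segment: (batch, segment_len, dim)
        update, _ = self.attn(query=memory, key=segment, value=segment)
        b, k, d = memory.shape
        # GRUCell expects 2-D inputs, so fold the slot axis into the batch axis
        new_memory = self.gate(update.reshape(b * k, d), memory.reshape(b * k, d))
        return new_memory.reshape(b, k, d)


if __name__ == "__main__":
    cell = SlotMemory()
    memory = cell.init_slots.unsqueeze(0).expand(2, -1, -1)  # batch of 2 streams
    stream = torch.randn(2, 5, 10, 64)                        # 5 segments x 10 items x 64 dims
    for t in range(stream.size(1)):
        memory = cell(memory, stream[:, t])                   # compress segment by segment
    print(memory.shape)                                       # torch.Size([2, 8, 64])
```

Written this way, the third reviewer's concern is easy to see: each slot's new content depends only on its own previous state and the attended segment, with no slot-to-slot interaction during writing.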
[Tokenized form of the row above — an integer token-ID array, an all-ones mask array, and a duplicate label array; they repeat the Input/Output text in numeric form and are omitted.]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.

### Review:

The paper proposes two simple modifications of the SAM optimizer that allow to significantly improve its computational efficiency without sacrificing performance.

This is a nice paper that is well written and addresses a very clear problem: reducing the computational complexity of SAM. It proposes two simple tricks that both make sense, and shows that they improve efficiency without hurting performance.

SWP implementation: I wanted to ask a question about SWP. How do you implement it in practice? I am a bit surprised that not computing the gradients for a random subset (e.g., 50%) of weights leads to a significant improvement in performance. Do you have to do any tricks to achieve this improvement?

Alignment of SSP gradient with full-batch gradient: You mention on page 5 that the gradient on the subset selected by SSP is well aligned with the gradient on the full batch. Did you try to actually measure this alignment, e.g., in terms of cosine similarity? It could be interesting to compare it to, e.g., a random subset and $\mathbb{B}$.

Why does ESAM outperform SAM? In some of your experiments ESAM outperforms SAM in accuracy. Do you have ideas for why this could be happening?

ImageNet results: One concern I have with the paper is that the original SAM paper [1] seems to report better results for SAM in some of the same settings than you do, in particular on ImageNet. In Table 2 you report 76.7 for ResNet-50 with SAM and 77.05 with ESAM; however, [1] reports 77.5 with SAM in their Table 2. Similarly, their results for ResNet-101 are better, and the results for PyramidNet on CIFAR are better. Is there a difference in the setting you use?

References:
[1] Sharpness-Aware Minimization for Efficiently Improving Generalization. Pierre Foret, Ariel Kleiner, Hossein Mobahi, Behnam Neyshabur.

Overall, this is a very nice paper; my only concern is about the results relative to the original SAM paper. If the authors address this concern in the rebuttal, I am happy to recommend this paper for acceptance.

docsep

The paper proposes techniques to improve the efficiency of the sharpness-aware minimization method. They are stochastic weight perturbation (select a subset of the parameters at any step) and sharpness-sensitive data selection. Results demonstrate efficacy over SAM at small batch sizes on multiple models.

Strengths:
- Simple modifications that improve over existing SAM results.
- Wide range of baselines.

Weaknesses:
- Lack of ablations across batch size, especially with the data-selection algorithm.

On the cost of SAM: The cost of SAM is mentioned as 100% in the text of the paper. Typical SAM applications, when batch sizes are larger, involve data-parallel training across a flock of GPUs. The efficient version of SAM that works in practice does not communicate the gradients across the cores, and hence the cost is not exactly 100% but lower. It would be good to know results against this standard version of SAM.

Hyperparameter tuning: It would be ideal if the authors normalized for the training cost. What would be the accuracy of SGD with 40% more epochs, with appropriate tuning of learning rate schedules?

Choice of gamma and batch size: Does gamma = 0.5 mean that 50% of the examples in a batch are discarded from the gradient step? It is quite fascinating that this method was able to improve accuracy while discarding half the batch. It would be extremely valuable to know (see the comment on the cost of SAM) whether this continues to remain true at much larger batches, for example 2048 for CIFAR-10/100.

Modifications are simple and efficient in practice and may improve the practical usage of SAM across workloads.

docsep

This paper investigates the crucial efficiency issue of the sharpness-aware minimizer (SAM). SAM improves the generalization of DNNs but results in a doubled training time compared to vanilla training. By analyzing the min-max procedure of SAM, the authors observe the computational redundancy and then propose a method, ESAM, to improve the efficiency from the data and parameter perspectives. The authors argue that SAM can be approximated properly with fewer computations. Empirical results show that ESAM can reduce the extra training time on the CIFAR-10/100 and ImageNet datasets with improved accuracy compared to SAM.

Pros:
1. The proposed approach is well motivated and practical in real-world applications. The enhanced efficiency does not lead to a degradation of accuracy. The experimental setup is satisfactory and convincing; the proposed method is verified on multiple datasets to show its effectiveness. The ablation studies and the supplementary experiments in the appendix are appropriate.
2. The idea of picking representative samples for updating parameters is interesting and effective. Why not use alpha directly as the hyperparameter for data selection?
3. The paper is well organized; Figures 3 and 4 are good for demonstration.

Cons:
1. It is mentioned that the computations of SWP slightly increase in deeper DNNs. What is the exact impact of using SWP in large-scale deep neural networks?
2. The explanation of SWP's computation in Section 2.2 is not clear and may be made shorter. For example, L8 ("decreasing beta may degrade...") is superfluous. The claimed positive correlation between saved computations and beta is vague and should be more precise.
3. The Gaussian perturbations used in Figure 4 are not representative enough; there is no difference between the x-axis perturbation and the y-axis perturbation. Consider visualizing the loss landscape with the adversarial perturbations.

Additional comments:
1. Better to use "hyperparameters of SGD" instead of "parameters of SGD" in Sec. 3.1, line 6.
2. Better to add the training speed (140.3% vs. SAM's 100%) for the efficiency comparison in Section 3.1 (p. 3, l. 4).

This work raises an important question about SAM, where the improved training algorithm requires doubled training cost. This paper addresses and improves the efficiency drawback of SAM. The proposed approach is well motivated and has been verified to improve the efficiency and accuracy of SAM. The authors also provide code for reproducibility. However, the degradation of SWP's effectiveness in large-scale neural networks, as well as the justifications for the improved accuracy contributed by SWP and SDS, have not been clearly addressed in this paper. I expect the authors can answer the points above, and then I can adjust my final score.

docsep

This paper presents a method called ESAM for improving the efficiency of SAM. ESAM has two components, SWP and SDS. SWP accelerates the estimation of epsilon via randomly sampling a subset of parameters in backpropagation; SDS further improves the efficiency via sampling a subset of data points that is enough for calculating the upper bound of L. Combining these two techniques, they achieve a speed improvement for SAM while yielding comparable or even better performance.

Strengths:
- This paper is well written and easy to follow.
- Since SAM has proven to be helpful for many applications, this technique can be helpful in practice.

Weaknesses:
- For SWP, is the gradient mask the same for the samples in one batch, or are different gradient masks applied for each sample in the batch?
- For SDS, an interesting baseline to compare against is estimating epsilon with a subset of randomly sampled data, which could match the original epsilon in terms of empirical estimation.
- A recent paper [1] points out that SAM can be very useful for ViT and MLP-Mixer; it would be more convincing if the authors could show that ESAM also works on those two architectures.

Detailed comments:
- First row on page 8, "as shwon in table 3 swp improves": "shwon" → "shown".

[1] Xiangning Chen, Cho-Jui Hsieh, and Boqing Gong. When vision transformers outperform ResNets without pretraining or strong data augmentations. arXiv preprint arXiv:2106.01548, 2021.

This paper is the pioneer in exploring a more efficient SAM method. This technique can be useful since SAM has been proven to be useful for many networks; therefore, I would recommend this paper as weak accept. I would be more convinced of the practicability of this paper if the authors can show that the proposed ESAM works on ViT and MLP-Mixer, where SAM has been proven to be very effective.

### Summary:
This paper focuses on improving the efficiency of the sharpness-aware minimization method for training neural networks. The proposals are stochastic weight perturbation, namely selecting a subset of the parameters at any step, and sharpness-sensitive data selection. The philosophy behind it sounds quite interesting to me, namely that the sharpness-aware minimizer can be approximated properly with fewer computations after analyzing the min-max procedure. This philosophy leads to a novel algorithm design I have never seen. The clarity and novelty are clearly above the bar of ICLR. While the reviewers had some concerns about the significance, the authors did a particularly good job in their rebuttal; thus all of us have agreed to accept this paper for publication. Please include the additional experimental results in the next version.
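For readers unfamiliar with the method under review, the two approximations the reviewers keep referring to (stochastic weight perturbation and sharpness-sensitive data selection) can be sketched on top of a standard SAM step. The function below is a rough, hypothetical rendering: it assumes `loss_fn` returns per-sample losses (reduction='none'), and the names `beta` and `gamma` only mirror the hyperparameters mentioned in the reviews. It is not the authors' implementation.

```python
import torch

def esam_like_step(model, loss_fn, x, y, base_opt, rho=0.05, beta=0.5, gamma=0.5):
    """One SAM-style update with two ESAM-like approximations:
    (1) stochastic weight perturbation: only a random fraction `beta` of each
        parameter tensor's coordinates is perturbed;
    (2) sharpness-sensitive data selection: the second gradient is computed only
        on the fraction `gamma` of samples whose loss rose most under the perturbation.
    Rough illustrative sketch; assumes `loss_fn` returns per-sample losses."""
    # first pass: per-sample loss and gradient at the current weights w
    loss_clean = loss_fn(model(x), y)                      # shape: (batch,)
    loss_clean.mean().backward()

    grad_norm = torch.norm(torch.stack(
        [p.grad.norm() for p in model.parameters() if p.grad is not None]))

    # climb to an approximate worst-case point w + eps (masked coordinates only)
    perturbations = []
    for p in model.parameters():
        if p.grad is None:
            continue
        mask = (torch.rand_like(p) < beta).float()
        eps = rho * mask * p.grad / (grad_norm + 1e-12)
        p.data.add_(eps)
        perturbations.append((p, eps))

    # keep the samples that became hardest after the perturbation
    with torch.no_grad():
        loss_perturbed = loss_fn(model(x), y)
    k = max(1, int(gamma * x.size(0)))
    keep = torch.topk(loss_perturbed - loss_clean.detach(), k).indices

    # second pass: gradient at w + eps, on the selected subset only
    model.zero_grad()
    loss_fn(model(x[keep]), y[keep]).mean().backward()

    # restore the original weights and apply the base optimizer with the SAM gradient
    for p, eps in perturbations:
        p.data.sub_(eps)
    base_opt.step()
    model.zero_grad()
```

With beta = gamma = 1 the function reduces to a plain SAM step, which makes it easy to see where the claimed savings would come from: the perturbation is formed for only a fraction beta of the coordinates, and the second backward pass touches only a fraction gamma of the batch.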
[Tokenized form of the row above — integer token-ID array with no readable content; omitted.]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 29328, 767, 2969, 14586, 273, 253, 1775, 5556, 6081, 326, 1581, 281, 3012, 3157, 697, 15180, 6733, 1293, 18501, 272, 3045, 436, 310, 247, 5322, 2929, 326, 310, 973, 3542, 285, 15974, 247, 1077, 2590, 1895, 8493, 253, 15180, 10454, 273, 1775, 352, 29328, 767, 2969, 24866, 326, 1097, 1056, 3282, 285, 2722, 326, 597, 3157, 6733, 1293, 34426, 3045, 50275, 2140, 81, 7092, 50276, 74, 3078, 281, 1642, 247, 1953, 670, 1863, 81, 849, 513, 368, 3359, 352, 275, 3946, 891, 717, 247, 2372, 9861, 326, 417, 12672, 253, 27935, 323, 247, 3632, 8578, 24088, 2456, 273, 13461, 5644, 281, 247, 1534, 7756, 275, 3045, 513, 368, 452, 281, 513, 667, 24866, 281, 5115, 436, 7756, 50275, 40446, 273, 256, 1033, 11786, 342, 2120, 23941, 11786, 50276, 5658, 3748, 275, 3239, 608, 326, 253, 11786, 327, 253, 8578, 4236, 407, 256, 1033, 310, 973, 15616, 342, 253, 11786, 327, 253, 2120, 14604, 858, 368, 1611, 281, 2686, 2557, 436, 12420, 24088, 275, 2426, 273, 7349, 460, 14259, 352, 812, 320, 4722, 281, 7277, 352, 281, 24088, 247, 3632, 8578, 285, 14168, 4482, 67, 50275, 22309, 1057, 1578, 312, 562, 32231, 1775, 50276, 249, 690, 273, 634, 4679, 1578, 312, 41731, 13015, 1775, 275, 7200, 513, 368, 452, 5697, 323, 2139, 436, 812, 320, 9369, 50274, 303, 6533, 292, 1543, 50276, 531, 4468, 891, 452, 342, 253, 2929, 310, 326, 253, 3236, 1775, 2929, 337, 3133, 281, 1304, 1805, 1543, 323, 1775, 275, 690, 273, 253, 1072, 7533, 685, 368, 513, 275, 1798, 327, 4440, 257, 292, 275, 2829, 374, 368, 1304, 818, 2251, 323, 501, 3024, 1235, 342, 1775, 285, 10484, 1762, 342, 1578, 312, 2299, 337, 5012, 818, 1976, 342, 1775, 275, 2829, 374, 12014, 616, 1543, 323, 501, 3024, 6903, 403, 1805, 285, 253, 1543, 323, 39694, 3024, 327, 260, 338, 274, 403, 1805, 50276, 261, 627, 247, 3064, 275, 253, 4758, 368, 897, 50274, 250, 3065, 50276, 18, 9479, 1255, 13823, 41458, 323, 14556, 11138, 26647, 18753, 250, 2273, 85, 247, 19399, 465, 282, 7068, 288, 375, 34103, 9119, 42128, 1602, 6292, 425, 656, 8621, 321, 4583, 436, 310, 247, 1077, 5322, 2929, 619, 760, 4468, 310, 670, 253, 1543, 4103, 281, 253, 3236, 1775, 2929, 604, 253, 4477, 2953, 436, 4468, 275, 253, 30080, 22559, 891, 717, 5211, 281, 5583, 436, 2929, 323, 14924, 5474, 339, 377, 6653, 29328, 5609, 281, 3157, 253, 6733, 273, 9479, 1255, 6600, 41458, 1332, 597, 403, 19191, 2801, 20452, 3609, 8578, 273, 253, 3602, 387, 667, 3213, 285, 9479, 1255, 19579, 941, 5438, 50276, 16680, 14371, 10307, 689, 1775, 387, 1355, 14604, 9552, 327, 2709, 3210, 50275, 296, 3755, 20556, 50276, 19583, 14586, 326, 19132, 689, 5368, 1775, 1543, 50276, 4363, 2491, 273, 1666, 25379, 50276, 20881, 1255, 50276, 77, 471, 273, 490, 77, 569, 2439, 14604, 1979, 3340, 342, 253, 2856, 511, 1788, 5933, 50254, 50273, 251, 2105, 273, 1775, 50276, 16736, 273, 1775, 310, 5393, 347, 2233, 275, 253, 2505, 273, 253, 2929, 6867, 1775, 2898, 672, 14604, 9552, 403, 4067, 6388, 941, 7529, 3733, 2439, 247, 29447, 273, 31025, 316, 253, 5919, 2715, 273, 1775, 326, 2987, 275, 3946, 513, 417, 13791, 253, 27935, 2439, 253, 23018, 285, 7613, 2105, 310, 417, 4555, 2233, 533, 2406, 352, 651, 320, 1175, 281, 871, 1543, 1411, 436, 2629, 2715, 273, 1775, 50275, 27049, 19484, 25184, 50275, 262, 651, 320, 7445, 604, 253, 4477, 12650, 323, 253, 3733, 2105, 752, 651, 320, 253, 7200, 273, 256, 35333, 342, 3387, 625, 44540, 342, 4569, 25184, 273, 4715, 2281, 28631, 50275, 22122, 273, 17356, 285, 14604, 1979, 
310, 253, 17356, 1762, 1599, 2456, 273, 6667, 275, 247, 14604, 403, 25665, 432, 253, 11786, 3213, 436, 310, 3240, 20996, 326, 436, 1332, 369, 2104, 281, 3157, 7200, 1223, 1262, 13218, 2716, 253, 14604, 352, 651, 320, 6685, 9865, 281, 871, 923, 4385, 327, 2105, 273, 1775, 281, 1199, 4067, 39657, 323, 1650, 1384, 2385, 323, 260, 338, 274, 6903, 361, 604, 436, 7788, 281, 3464, 2032, 50276, 2307, 6787, 403, 2969, 285, 5919, 275, 3946, 285, 778, 3157, 253, 8542, 10393, 273, 1775, 2439, 32140, 84, 5474, 33032, 2520, 2929, 2340, 684, 253, 9560, 6733, 2523, 273, 253, 9479, 1255, 13823, 7221, 6081, 1775, 1775, 19132, 253, 26647, 273, 277, 79, 2224, 533, 1543, 275, 247, 4021, 3733, 673, 2429, 281, 26724, 3733, 407, 18918, 253, 1054, 4090, 5199, 273, 1775, 253, 4477, 10018, 253, 15180, 39296, 285, 840, 12661, 247, 1332, 1578, 312, 281, 3157, 253, 6733, 432, 253, 941, 285, 3602, 24302, 253, 4477, 9059, 326, 1775, 476, 320, 34930, 6283, 342, 11184, 30745, 16774, 1543, 921, 326, 1578, 312, 476, 4796, 253, 4465, 3733, 673, 275, 260, 338, 274, 6903, 361, 285, 4440, 257, 292, 15302, 342, 5520, 7200, 2429, 281, 1775, 50276, 856, 84, 50276, 18, 186, 783, 4081, 2746, 310, 973, 24013, 8550, 285, 8542, 275, 1524, 10186, 4893, 253, 8655, 6733, 1057, 417, 1421, 281, 247, 11961, 273, 7200, 253, 5661, 9978, 310, 20297, 285, 21414, 253, 4081, 1332, 310, 16058, 327, 2709, 15302, 281, 921, 253, 12510, 253, 28913, 2175, 285, 253, 24864, 4679, 275, 253, 30762, 403, 4569, 374, 186, 783, 2934, 273, 8871, 8612, 3530, 323, 22753, 3602, 310, 4722, 285, 3576, 2139, 417, 897, 253, 9765, 3587, 347, 253, 4373, 19484, 323, 941, 5438, 495, 186, 783, 2929, 310, 973, 34092, 8442, 495, 285, 577, 403, 1175, 323, 17227, 50276, 5040, 337, 186, 262, 310, 5393, 326, 253, 30745, 273, 1863, 81, 5777, 2572, 275, 253, 12861, 277, 79, 2224, 752, 310, 253, 3242, 3486, 273, 970, 1863, 81, 275, 1236, 2510, 25912, 3676, 11454, 6928, 50276, 19, 186, 783, 8813, 273, 1863, 793, 13782, 275, 2593, 3307, 310, 417, 2590, 285, 778, 320, 1160, 12217, 323, 1650, 298, 25, 11052, 9840, 50276, 11159, 40195, 310, 2221, 1258, 3472, 253, 7558, 2762, 5921, 875, 9809, 30745, 342, 9840, 310, 21248, 285, 943, 320, 625, 10799, 50276, 20, 186, 783, 305, 12064, 26309, 908, 275, 4677, 577, 403, 417, 8612, 2217, 627, 310, 642, 3064, 875, 253, 1269, 10565, 20452, 285, 253, 340, 10565, 20452, 1908, 5304, 3006, 253, 2957, 13016, 342, 253, 48960, 26309, 50276, 38092, 5701, 337, 186, 29266, 281, 897, 4373, 22041, 273, 256, 35333, 281, 8171, 3602, 273, 256, 35333, 275, 4706, 4562, 1386, 721, 50276, 19, 186, 29266, 281, 823, 3733, 3885, 11858, 20, 4632, 1775, 2233, 323, 253, 6733, 5301, 275, 2593, 4562, 268, 20, 77, 21, 11858, 20, 4632, 1775, 2233, 50276, 2520, 789, 16540, 271, 1774, 1953, 273, 1775, 835, 253, 5520, 3733, 5933, 4419, 25128, 3733, 2105, 436, 2929, 12453, 285, 19132, 253, 6733, 32489, 273, 1775, 253, 4081, 2746, 310, 973, 24013, 8550, 285, 556, 644, 16058, 281, 3157, 253, 6733, 285, 7200, 273, 1775, 253, 4477, 671, 2085, 11646, 323, 38041, 2299, 253, 11961, 273, 1863, 793, 12510, 275, 1236, 2510, 25912, 11454, 6928, 347, 973, 347, 253, 816, 6787, 323, 253, 5520, 7200, 9945, 407, 1863, 81, 285, 256, 1397, 452, 417, 644, 4518, 9713, 275, 436, 2929, 891, 1902, 253, 2488, 476, 3662, 253, 2792, 1840, 285, 840, 891, 476, 4575, 619, 2457, 4868, 50276, 7152, 33032, 2520, 2929, 10262, 247, 1332, 1925, 1578, 312, 323, 11138, 253, 6733, 273, 1775, 1578, 312, 556, 767, 4295, 1863, 81, 285, 256, 1397, 1863, 84, 17308, 684, 253, 13418, 273, 299, 4277, 3066, 3632, 10491, 247, 
8578, 273, 3602, 275, 896, 44263, 318, 256, 1397, 2007, 19132, 253, 6733, 3066, 10491, 247, 8578, 273, 941, 2792, 326, 310, 2217, 323, 18899, 253, 5170, 3033, 273, 298, 16248, 841, 767, 5609, 597, 5115, 247, 3885, 7756, 323, 1775, 1223, 27012, 10870, 390, 1014, 1805, 3045, 20544, 50276, 2520, 2929, 310, 973, 3542, 285, 3477, 281, 956, 50276, 6050, 1775, 556, 11464, 281, 320, 9371, 323, 1142, 4893, 436, 5853, 476, 320, 9371, 275, 3946, 50276, 20881, 1255, 50275, 1542, 1863, 81, 310, 253, 11786, 8989, 253, 1072, 323, 253, 3530, 275, 581, 14604, 390, 1027, 11786, 25965, 403, 3732, 323, 1016, 3410, 275, 253, 14604, 50276, 1542, 256, 1397, 271, 4722, 8245, 281, 320, 2429, 310, 26230, 299, 4277, 342, 247, 8578, 273, 12421, 19958, 941, 534, 812, 3761, 253, 6510, 299, 4277, 275, 2426, 273, 16774, 13418, 50275, 45019, 2929, 337, 2792, 326, 1775, 476, 320, 1077, 4217, 275, 9084, 285, 13361, 2617, 895, 254, 352, 651, 320, 625, 21414, 604, 253, 2488, 812, 921, 1578, 312, 476, 671, 789, 327, 1110, 767, 35615, 50276, 5992, 7193, 5701, 50276, 7053, 4194, 275, 3239, 854, 50276, 284, 439, 33382, 275, 2829, 495, 1863, 81, 19132, 439, 33382, 50276, 40831, 50276, 18, 260, 864, 1269, 22589, 920, 2093, 75, 4113, 288, 48188, 73, 285, 1766, 82, 272, 305, 543, 672, 8113, 4979, 398, 562, 32231, 501, 47301, 1293, 3215, 26208, 390, 2266, 941, 35919, 569, 549, 32693, 638, 3845, 549, 32693, 19, 12971, 10496, 2385, 43425, 436, 2929, 310, 253, 37775, 275, 18216, 247, 625, 5919, 1775, 1332, 436, 5853, 476, 320, 4217, 1580, 1775, 556, 644, 11464, 281, 320, 4217, 323, 1142, 6928, 3103, 891, 651, 5583, 436, 2929, 347, 5075, 2997, 891, 651, 320, 625, 21414, 253, 2283, 280, 1430, 273, 436, 2929, 604, 253, 4477, 476, 921, 253, 4081, 1578, 312, 2987, 327, 9084, 285, 13361, 81, 33947, 835, 253, 1775, 556, 644, 11464, 281, 320, 1077, 3576, 2490, 187, 4118, 18435, 27, 2520, 2929, 16633, 327, 11138, 253, 6733, 273, 9479, 1255, 13823, 41458, 1332, 323, 3733, 11454, 6928, 253, 18595, 403, 19191, 2801, 20452, 10775, 17221, 8578, 273, 253, 3602, 387, 667, 3213, 285, 9479, 1255, 19579, 941, 5438, 253, 11727, 3212, 7835, 3240, 4722, 281, 479, 10775, 9479, 1255, 13823, 7221, 6081, 476, 320, 34930, 6283, 342, 11184, 30745, 846, 18918, 253, 1054, 4090, 5199, 436, 11727, 5644, 281, 247, 4460, 5933, 2216, 891, 452, 1620, 2326, 50276, 783, 19843, 285, 38135, 403, 4518, 1840, 253, 2534, 273, 17857, 32888, 1223, 253, 30628, 574, 690, 7350, 327, 253, 8453, 253, 4477, 858, 247, 3782, 1175, 2628, 275, 616, 30080, 22559, 3021, 512, 273, 441, 452, 5821, 281, 2997, 436, 2929, 323, 9311, 4496, 2486, 253, 3081, 5661, 1543, 275, 253, 1735, 2715 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review:
This work extends zhou et al 2019 and ramanujan et al 2019 to allow sign flipping in the supermasks without updating the magnitudes of the weights. The main technical contributions are the new thresholding-based training method and the new weight initialization scheme that takes the supermasks into account. Empirical results show better performance on small conv nets and also on residual networks on the CIFAR-10/100 datasets.

Firstly, this paper is written clearly and is easy to follow. The main technical contributions are clear, and the performance improvements over zhou et al 2019 and ramanujan et al 2019 are clear. However, there are several concerns I have for this work.

1. I think a more detailed ablation study is needed for the experiments. For example, it is imaginable that the threshold tau will have an influence on the number of remaining parameters, but the corresponding experiments are lacking; having some would be nice. Also, the new training technique is always coupled with the new initialization method; an ablation study decoupling the effect of these two contributions would help readers understand which one is more important. This is crucial because in ramanujan et al 2019 the authors also investigated a certain form of scaling of the parameters, which was found to improve the performance significantly.
2. In the abstract, the authors pose this work as tackling the issue of difficult interpretation and training/usage in real-world applications of highly overparameterized networks due to their size. However, I don't think the proposed method saves anything during training (e.g., memory, computation in FLOPs, or time), because it uses latent weights.

Minor point: in Table 2, should the baseline accuracy for Conv-8 be 82 instead of 72?

Overall, I think this work proposes an interesting extension to previous supermask works but lacks necessary experiments, making the results less convincing.

docsep

This paper introduces signed supermasks, which builds on top of the supermasks line of work on finding binary masks on untrained networks that result in good performance. This work extends the original supermask by adding the ability to flip the sign of the weights: the proposed method learns mask parameters and converts them into a ternary mask through the use of two thresholds (see the illustrative sketch after the reviews). The paper further proposes to use the ELU activation function and an ELUS initialization scheme that is more tailored to supermask training. Performance is compared against fully trained models and prior work on supermasks on MNIST, CIFAR-10, and CIFAR-100, and the proposed method shows competitive results.

Strengths: this paper is well written and the proposed method is clearly described. The proposed method is fairly novel, and the resulting high performance and sparsity levels of the Conv-2/4/6/8 models are inspiring.

Weaknesses: the paper motivates the signed supermask as a way to improve the interpretability of the trained model. While this would be very interesting, I find the supporting interpretability analyses to be lacking in substance. The mask visualizations are limited to the first layer of a FC network on MNIST and cannot be directly extended to more layers or other architectures. The observations of layer-wise pruning ratios are interesting, but they are not new and have been well studied in the pruning literature. Another motivation is in terms of efficiency due to sparsity and compression of the final trained model. On this point, I think it is more appropriate to compare performance to other ternary network methods, as the supermask baselines have much less capacity due to maintaining the sign at initialization. Is the signed supermask a competitive method for training ternary networks? How does it compare to other ternary methods in terms of the performance/efficiency tradeoff? Testing the limits of how bare-bones a network can be is an interesting question, and it may very well be the case that the signed supermask pushes this limit, but this needs to be made clear through a quantitative comparison to similar methods rather than just a discussion. Comparisons to the other supermask baselines are also missing for the ResNet models, for some reason.

This paper presents an interesting new approach to training ternary neural networks with high sparsity levels. The paper is well written, and several of the results look promising and would be of interest to the community. However, I take some issue with the motivations of this paper. As a method proposed to produce insights on neural network training and to enable interpretability of networks, I find a lack of new insights in the paper. As a method that improves efficiency and compressibility, I find it very intriguing and promising, but the paper currently lacks the appropriate performance comparisons with other ternary neural networks to fully convince me of its advantages.

Update after rebuttal: thank you to the author for the update. I agree that there is value in training the most bare-boned network that we can, even though not all of that value has been demonstrated in the present work; I think it will be useful in many, perhaps unexpected, ways in the future. Although this is not really discussed, I am particularly interested in how this form of training can act as a particular form of regularization that is orthogonal to most other forms. I'll raise my score from 5 to 8 and encourage the authors to continue in this direction.

docsep

This paper discusses the signed supermask, which can significantly improve the model accuracy of untrained neural networks while enhancing sparsity compared to the original supermask idea. The authors introduce -1 as an additional mask value to enable flipping the sign of an initialized weight, and they suggest related activation functions and fixed threshold hyperparameters to achieve sparse networks. Analysis of sparsity and the corresponding accuracy is given for various CNN models, including ResNet models; the role of batch normalization for large models is also studied.

Previous studies on supermasks are interesting because they provide the insight that even untrained models have a chance to achieve reasonable model accuracy, provided that we can find such a supermask in advance. This reviewer is not sure whether minor improvements over the original supermask idea are critical if a large amount of training is still required. If the authors could show that the model accuracy is significantly improved even for untrained models compared to the previous supermask, it would help us understand the inherent characteristics of neural network initialization. But as shown in Table 1, compared to the work by zhou et al, even the test accuracy is slightly degraded, despite the additional -1 mask, while normal training is still required. The original idea of the supermask is meaningful because it shows the new insight that an untrained model already contains a subnetwork of reasonable accuracy; if the authors only want to claim a new supermask generation method, that would not produce an additional new insight. Such a concern is obvious for residual networks: without training the batch norm parameters, noticeable accuracy degradation is unavoidable, and batch-norm hyperparameters are known to be affine parameters that are superior to normal weights in terms of expressive power.

Since the existence of supermasks is already known, the focus of new research on supermasks would need to be finding them efficiently with minimal effort; otherwise supermasks would not be practical in the field. Unfortunately, as indicated in Table 1, the training time to find a supermask is quite slow. Then why don't we just perform a usual pruning method to achieve better model accuracy? Comparison with previous works is missing; at least this paper needs to include "What's Hidden in a Randomly Weighted Neural Network?" (CVPR 2020). Since this paper introduces only marginal improvements over the original supermask work, this reviewer cannot find significantly new insights in it. Supermasks need to deliver much higher model accuracy than previous works, or require minimal effort to compute.

docsep

This paper proposes the signed supermask, an extension of the original supermask work (zhou2019) for finding more efficient untrained subnetworks. Instead of learning a binary mask, the signed supermask claims and shows that adding another mask value (-1) leads to higher sparsity with higher accuracy. The main contribution of this paper is the introduction of weight flipping as an improvement over the existing approaches. The method is simple and effective, and the empirical experiments validate the effectiveness of the proposal.

Pros:
1. The paper proposes a method to find untrained subnetworks that approach the performance achieved by trained networks. The performance improvement achieved by allowing weights to flip is interesting.
2. Overall, the paper is well written; readers can directly grasp the main idea of the paper.

Cons:
1. My main concern is the motivation and the usage of the proposed method. As the authors say in the introduction, "however, due to their sheer size, the networks not only became difficult to interpret but also problematic to train and use in real-world applications"; I therefore expect a method to be proposed that can either reduce the number of trained parameters or require fewer FLOPs. While the authors claim that the discovered subnetworks are untrained, I find that the method requires training mask matrices with the same size as the model weights. Considering that a straight-through estimator is used to estimate gradients, I suppose the backward pass is also dense and thus yields no acceleration for training; compared with directly training a dense network, I suppose the overall training FLOPs required by the signed supermask are similar. The larger training time (tt) per epoch over the baseline in Table 1 confirms my concern. I believe it is necessary to explain the benefits of the signed supermask compared with training a dense network and then pruning, or with directly training a sparse network at the sparsity learned by the signed supermask. I believe even directly training a sparse neural network with static or dynamic sparsity (e.g., mocanu et al [1], evci et al [2], liu et al [3]) can reach similar performance with much fewer training FLOPs. I encourage the authors to clarify this.
2. Some related works are missing from the paper, e.g., diffenderfer & kailkhura [4] (accepted at ICLR 2021) and chijiwa et al [5] (accepted at NeurIPS 2021). Since the number of works on this topic is quite small, I expect a good submission to at least do a good related-work job by introducing them. What's more, as they aim to search for untrained subnetworks without adding another mask value, comparisons with these works are encouraged. The current experiments only include comparisons with zhou2019 and ramanujan2019 on small convolutional networks, which is too small a setting to draw a solid conclusion; for instance, ramanujan2019, [4], and [5] all provide results on the large-scale dataset ImageNet.
3. As I understand it, the performance improvement of the signed supermask comes from (1) allowing weight flips and (2) the ELUS initialization. I expect ELUS to be a universal technique that could improve all the related works; however, I did not see any ablation study of these two components. It is not clear to me whether the improvement is caused by one of them or by both.
4. As mentioned by the authors, the two threshold hyperparameters in Equation 1 control the sparsity level. I expect to see more experiments on how these thresholds influence the sparsity level and the corresponding performance of the signed supermask.

Minor typos:
- Instead of starting with a new paragraph, there is an inexplicable blank after some lines, e.g., the 2nd paragraph on page 1 and the 1st paragraph on page 7.
- A space is missing before the sentence "the gradient is estimated" on page 3.

References:
[1] Mocanu, Decebal Constantin, et al. "Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science." Nature Communications 9.1 (2018): 1-12.
[2] Evci, Utku, et al. "Rigging the lottery: Making all tickets winners." International Conference on Machine Learning, PMLR, 2020.
[3] Shiwei Liu, et al. "Sparse training via boosting pruning plasticity with neuroregeneration." NeurIPS 2021.
[4] Diffenderfer, J., and Kailkhura, B. "Multi-prize lottery ticket hypothesis: Finding accurate binary neural networks by pruning a randomly weighted network." ICLR 2021.
[5] Chijiwa, Daiki, et al. "Pruning randomly initialized neural networks with iterative randomization." NeurIPS 2021.

While the performance achieved by untrained NNs is interesting, the motivation and experiments need more work.
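To make the mask mechanism the reviewers describe concrete, below is a minimal, hypothetical PyTorch-style sketch of a signed-supermask layer: real-valued mask scores are quantized to {-1, 0, +1} with two thresholds, the frozen initial weights are multiplied by this ternary mask, and a straight-through estimator passes gradients to the mask scores. The threshold values, class and variable names, and the placeholder weight initialization are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TernaryMaskSTE(torch.autograd.Function):
    """Quantize mask scores to {-1, 0, +1} using two thresholds;
    the backward pass is a straight-through (identity) estimator."""

    @staticmethod
    def forward(ctx, scores, t_lo, t_hi):
        mask = torch.zeros_like(scores)
        mask[scores > t_hi] = 1.0
        mask[scores < t_lo] = -1.0
        return mask

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through: pass the gradient to the scores unchanged.
        return grad_output, None, None

class SignedSupermaskLinear(nn.Module):
    def __init__(self, in_features, out_features, t_lo=-0.05, t_hi=0.05):
        super().__init__()
        # Weights are drawn once and frozen; only the mask scores are trained.
        self.weight = nn.Parameter(torch.empty(out_features, in_features),
                                   requires_grad=False)
        nn.init.kaiming_normal_(self.weight)  # placeholder, not the paper's ELUS init
        self.scores = nn.Parameter(0.01 * torch.randn_like(self.weight))
        self.t_lo, self.t_hi = t_lo, t_hi

    def forward(self, x):
        mask = TernaryMaskSTE.apply(self.scores, self.t_lo, self.t_hi)
        # The ternary mask zeroes out or sign-flips the frozen initial weights.
        return F.linear(x, mask * self.weight)
```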
### Summary:
This paper builds on previous work on supermasks. It proposes to replace binary masks by a signed supermask, i.e., a trainable threshold-based mask that can take values from {-1, 0, 1}. This change, in combination with the use of ELU activation functions and an ELU-specific initialization strategy, leads to a significantly higher pruning rate while keeping competitive performance in comparison to baseline models. Most reviewers agreed that the paper is well written and that the proposed approach and the experimental findings are interesting. The motivation to improve interpretability was commonly perceived as misleading; another downside that was mentioned is the training-time efficiency. This, however, should not be weighted too heavily, since the work focuses on finding the smallest possible subnetwork that still performs well without changing the weight values and, in line with work on the lottery ticket hypothesis, aims at understanding more about the structure of the winning tickets, which is interesting in itself. The paper therefore should be accepted.
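As a follow-up to the sketch above, the summary's two ingredients (signed-supermask layers plus ELU activations) can be combined into a small model, and the pruning rate the meta-review refers to can be read off as the fraction of weights whose ternary mask is zero. The layer sizes and thresholds below are arbitrary assumptions, and the paper's ELU-specific initialization is not reproduced here; the placeholder init from the previous sketch stands in for it.

```python
import torch
import torch.nn as nn

# Reuses SignedSupermaskLinear from the sketch above.
class SupermaskMLP(nn.Module):
    def __init__(self, sizes=(784, 300, 100, 10), t_lo=-0.05, t_hi=0.05):
        super().__init__()
        self.layers = nn.ModuleList(
            SignedSupermaskLinear(n_in, n_out, t_lo, t_hi)
            for n_in, n_out in zip(sizes[:-1], sizes[1:])
        )
        self.act = nn.ELU()  # ELU activations, as highlighted in the summary

    def forward(self, x):
        for i, layer in enumerate(self.layers):
            x = layer(x)
            if i < len(self.layers) - 1:
                x = self.act(x)
        return x

@torch.no_grad()
def pruning_rate(model):
    """Fraction of frozen weights whose ternary mask is exactly zero."""
    zero, total = 0, 0
    for layer in model.layers:
        mask = torch.zeros_like(layer.scores)
        mask[layer.scores > layer.t_hi] = 1.0
        mask[layer.scores < layer.t_lo] = -1.0
        zero += (mask == 0).sum().item()
        total += mask.numel()
    return zero / total
```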
[ 31850, 6974, 326, 310, 625, 27846, 281, 2221, 12477, 3733, 3045, 310, 2429, 1411, 4751, 32927, 3210, 285, 2720, 789, 327, 2221, 78, 6579, 327, 278, 79, 382, 260, 338, 274, 740, 285, 260, 338, 274, 2313, 285, 253, 4081, 1332, 2722, 12085, 1543, 4757, 50275, 2520, 2929, 310, 973, 3542, 285, 253, 4081, 1332, 310, 4518, 2529, 50275, 783, 4081, 1332, 310, 9648, 4460, 285, 253, 4795, 1029, 3045, 285, 37139, 414, 1268, 273, 253, 2410, 2164, 2358, 3210, 403, 29853, 50275, 20881, 1255, 50275, 783, 2929, 15265, 684, 253, 6704, 2221, 12477, 347, 247, 1039, 281, 3157, 253, 4665, 1430, 273, 253, 10166, 1566, 1223, 436, 651, 320, 1077, 4722, 891, 1089, 326, 253, 8109, 4665, 1430, 6260, 281, 320, 14999, 275, 10359, 253, 8989, 5304, 5904, 403, 3710, 281, 253, 806, 3828, 273, 247, 269, 68, 2990, 327, 278, 79, 382, 285, 2550, 320, 3587, 6508, 281, 625, 8090, 390, 643, 35615, 253, 7313, 273, 3828, 3020, 819, 25004, 11878, 403, 4722, 533, 403, 417, 747, 285, 452, 644, 973, 5421, 275, 253, 819, 25004, 6239, 50274, 23955, 16038, 310, 275, 2426, 273, 6733, 1955, 281, 37139, 414, 285, 13800, 273, 253, 2457, 10166, 1566, 281, 436, 1127, 891, 1158, 352, 310, 625, 4569, 281, 7277, 3045, 281, 643, 31000, 2990, 3082, 347, 253, 2221, 12477, 1666, 25379, 452, 1199, 1679, 5350, 1955, 281, 11850, 253, 861, 387, 31850, 310, 6704, 2221, 12477, 247, 12085, 1332, 323, 3733, 31000, 6928, 849, 1057, 352, 7277, 281, 643, 31000, 3082, 275, 2426, 273, 253, 3045, 285, 6733, 5454, 2727, 5175, 253, 7787, 273, 849, 8050, 47473, 476, 247, 2990, 320, 310, 271, 4722, 1953, 285, 352, 778, 1077, 973, 320, 253, 1083, 326, 253, 6704, 2221, 12477, 32804, 436, 2701, 533, 352, 3198, 281, 320, 1160, 2590, 949, 247, 11745, 5301, 281, 2074, 3082, 2581, 685, 816, 247, 5955, 50275, 681, 1148, 10047, 281, 253, 643, 2221, 12477, 1666, 25379, 403, 5816, 275, 253, 501, 3024, 3210, 323, 690, 1921, 50276, 2520, 2929, 10262, 271, 4722, 747, 2746, 273, 3733, 31000, 11454, 6928, 342, 1029, 37139, 414, 2308, 253, 2929, 310, 973, 3542, 285, 2067, 273, 253, 1543, 1007, 12532, 285, 651, 320, 273, 1600, 281, 253, 3114, 2299, 891, 1379, 690, 2523, 342, 253, 42852, 273, 436, 2929, 347, 247, 1332, 326, 310, 4081, 281, 4711, 16039, 327, 11454, 2990, 3733, 285, 8046, 4665, 1752, 874, 273, 6928, 891, 1089, 247, 3480, 273, 747, 16039, 2530, 275, 253, 2929, 347, 247, 1332, 326, 19132, 6733, 285, 19477, 2322, 891, 1089, 352, 281, 320, 1077, 27807, 285, 12532, 533, 253, 2929, 4390, 19756, 253, 4569, 3045, 14023, 342, 643, 31000, 11454, 6928, 281, 4751, 18578, 479, 273, 697, 11361, 50275, 11183, 846, 30080, 22559, 50276, 47033, 368, 281, 253, 2488, 323, 253, 5731, 891, 5194, 326, 627, 310, 1318, 275, 3733, 253, 954, 8050, 4006, 264, 2990, 326, 359, 476, 1014, 2167, 417, 512, 253, 1318, 556, 644, 5183, 275, 253, 1246, 789, 891, 1158, 352, 588, 320, 4217, 275, 1142, 4931, 12439, 4088, 275, 253, 2852, 3738, 436, 310, 417, 1663, 5469, 516, 3782, 6110, 275, 849, 436, 830, 273, 3733, 476, 769, 347, 247, 1798, 830, 273, 37820, 326, 310, 19627, 281, 954, 643, 4948, 2853, 7164, 619, 4868, 432, 608, 281, 854, 285, 11907, 253, 4477, 281, 4035, 275, 436, 3884, 5474, 33032, 2520, 2929, 25339, 253, 6704, 2221, 12477, 326, 476, 3157, 253, 1566, 7200, 273, 440, 32927, 11454, 6928, 3012, 1223, 22474, 37139, 414, 2429, 281, 253, 3236, 2221, 12477, 2934, 253, 4477, 9569, 337, 347, 271, 3081, 8989, 1318, 281, 8046, 46899, 247, 861, 273, 271, 31260, 2801, 285, 1804, 2905, 5743, 3470, 285, 4229, 7887, 4373, 22041, 281, 5115, 23507, 6928, 1783, 327, 37139, 414, 285, 3969, 7200, 310, 1677, 323, 
2710, 260, 9866, 3210, 1690, 501, 3024, 3210, 253, 2554, 273, 14604, 21539, 323, 1781, 3210, 310, 671, 5421, 2045, 2175, 327, 2221, 12477, 403, 4722, 984, 597, 2085, 16039, 326, 1014, 440, 32927, 3210, 452, 14512, 281, 5115, 5272, 1566, 7200, 760, 604, 359, 476, 1089, 824, 2221, 12477, 275, 7170, 436, 37317, 310, 417, 2119, 1880, 5884, 11701, 689, 253, 3236, 2221, 12477, 2934, 403, 4619, 604, 247, 1781, 2408, 273, 3733, 310, 1335, 2424, 50276, 338, 253, 4477, 476, 10313, 326, 253, 1566, 7200, 476, 320, 3012, 5520, 1014, 323, 440, 32927, 3210, 2429, 281, 2045, 2221, 12477, 352, 651, 320, 9371, 281, 2096, 253, 12794, 5319, 273, 253, 11454, 6928, 31850, 533, 347, 2011, 275, 2829, 337, 2429, 281, 253, 789, 407, 1182, 14451, 1162, 355, 1014, 1071, 7200, 310, 5777, 30853, 1014, 342, 3081, 337, 8989, 1223, 2622, 3733, 310, 1335, 2424, 253, 3236, 2934, 273, 2221, 12477, 310, 14282, 984, 352, 2722, 247, 747, 12288, 326, 440, 32927, 1566, 556, 2168, 247, 749, 18428, 273, 5272, 7200, 604, 253, 4477, 971, 281, 1750, 326, 359, 476, 6635, 247, 747, 2221, 12477, 5978, 1332, 352, 651, 417, 4711, 23000, 747, 12288, 824, 4468, 310, 4755, 323, 12541, 6928, 1293, 3733, 14604, 5222, 3602, 352, 310, 46133, 281, 923, 28629, 7200, 11961, 840, 10464, 1451, 526, 4373, 22041, 403, 1929, 281, 320, 29438, 4373, 22041, 326, 403, 8936, 281, 2622, 13461, 275, 2426, 273, 43541, 1612, 1580, 253, 6242, 273, 2221, 12477, 310, 2168, 1929, 253, 2770, 273, 747, 2561, 275, 2221, 12477, 651, 878, 281, 320, 4560, 2221, 12477, 14556, 342, 8723, 6031, 5010, 2221, 12477, 651, 417, 320, 8542, 275, 253, 1673, 19235, 347, 4860, 275, 2829, 337, 3733, 673, 281, 1089, 2221, 12477, 310, 3240, 3468, 840, 2139, 359, 13414, 816, 1347, 7312, 819, 25004, 1332, 281, 5115, 1805, 1566, 7200, 50276, 47109, 327, 253, 2045, 2987, 310, 5816, 387, 1878, 436, 2929, 3198, 281, 2486, 47515, 8763, 275, 247, 12421, 17375, 11454, 2990, 275, 30105, 1087, 9169, 1580, 436, 2929, 23970, 760, 16888, 11701, 689, 3236, 2221, 12477, 789, 436, 37317, 2550, 1089, 3012, 747, 16039, 432, 436, 2929, 2221, 12477, 3198, 281, 6388, 1199, 2169, 1566, 7200, 685, 2045, 2987, 390, 8723, 6031, 281, 320, 10302, 50276, 7152, 33032, 2520, 2929, 29328, 6704, 2221, 12477, 50276, 266, 6880, 273, 253, 3236, 2221, 12477, 789, 1182, 14451, 9638, 323, 4560, 625, 5919, 440, 32927, 749, 3024, 4896, 3185, 273, 4715, 247, 8985, 8989, 6704, 2221, 12477, 3916, 285, 2722, 326, 6240, 1529, 7877, 337, 281, 253, 25965, 5644, 281, 2169, 37139, 414, 342, 2169, 7200, 253, 2022, 7680, 273, 436, 2929, 310, 253, 10199, 273, 2801, 46899, 347, 271, 7756, 689, 253, 5368, 7274, 253, 1332, 310, 2969, 285, 3576, 253, 16774, 4679, 17813, 253, 12510, 273, 253, 10419, 50274, 856, 84, 50272, 18, 253, 2929, 29328, 247, 1332, 281, 1089, 440, 32927, 749, 3024, 4896, 326, 476, 2746, 253, 3045, 6786, 407, 10166, 6928, 253, 3045, 7756, 6786, 407, 6941, 13461, 281, 19153, 310, 4722, 50273, 19, 4583, 253, 2929, 310, 973, 3542, 10668, 476, 3587, 15909, 253, 2022, 2934, 273, 253, 2929, 50272, 5040, 50275, 18, 619, 2022, 4468, 310, 253, 16038, 285, 253, 10393, 273, 253, 4081, 1332, 50276, 284, 253, 4477, 753, 275, 253, 10199, 2299, 1955, 281, 616, 23658, 1979, 253, 6928, 417, 760, 3395, 2834, 281, 4665, 533, 671, 20276, 281, 6194, 285, 897, 275, 1524, 10186, 4893, 891, 1902, 247, 1332, 326, 476, 2057, 4796, 253, 1180, 273, 10166, 3602, 390, 2430, 247, 11184, 1180, 273, 892, 2695, 281, 320, 4081, 1223, 253, 4477, 1750, 326, 253, 749, 3024, 4896, 6888, 403, 440, 32927, 891, 1089, 326, 352, 4419, 3733, 8989, 12624, 342, 253, 
1072, 1979, 347, 253, 1566, 13461, 7296, 326, 4951, 10489, 29107, 310, 908, 281, 6642, 27935, 891, 9428, 253, 19265, 1509, 310, 671, 14086, 285, 3021, 5644, 281, 642, 17680, 323, 3733, 2429, 342, 3587, 3733, 247, 14086, 2990, 891, 9428, 253, 4583, 3733, 892, 2695, 2424, 407, 6704, 2221, 12477, 310, 2074, 253, 4067, 42085, 50276, 554, 3770, 689, 253, 8245, 275, 2829, 337, 23849, 619, 4468, 891, 2868, 352, 310, 3309, 281, 5513, 253, 5373, 273, 6704, 2221, 12477, 2429, 342, 3733, 247, 14086, 2990, 285, 819, 2517, 390, 3587, 3733, 247, 23507, 2990, 342, 253, 37139, 414, 6311, 407, 6704, 2221, 12477, 891, 2868, 1014, 3587, 3733, 247, 23507, 11454, 2990, 342, 4228, 390, 7870, 37139, 414, 24088, 278, 406, 40160, 1162, 355, 337, 612, 5297, 1162, 355, 374, 632, 86, 1162, 355, 495, 476, 452, 2074, 3045, 1223, 342, 1199, 11184, 3733, 892, 2695, 891, 11907, 253, 4477, 281, 19148, 436, 50276, 19, 627, 403, 690, 2905, 2987, 326, 403, 5816, 275, 253, 2929, 24088, 2171, 3109, 1592, 50276, 76, 647, 17616, 5650, 577, 7607, 407, 17857, 32888, 43425, 285, 448, 27520, 8754, 608, 7607, 407, 5723, 2824, 43425, 1580, 253, 1180, 273, 2987, 327, 436, 9400, 310, 3240, 1355, 891, 1902, 247, 1175, 19529, 943, 387, 1878, 513, 247, 1175, 2905, 789, 2628, 407, 16984, 731, 47515, 625, 347, 597, 403, 26400, 281, 3186, 323, 440, 32927, 749, 3024, 4896, 1293, 6240, 1529, 7877, 14023, 342, 841, 2987, 403, 14659, 253, 1655, 4679, 760, 2486, 14023, 342, 1182, 14451, 9638, 285, 391, 14990, 10441, 266, 9638, 342, 1355, 27311, 267, 6928, 534, 310, 1512, 1355, 281, 3812, 247, 4891, 6452, 323, 4227, 50276, 3358, 266, 10441, 266, 9638, 577, 285, 608, 512, 2085, 1543, 327, 1781, 4311, 10895, 4440, 257, 292, 50276, 20, 347, 891, 2096, 253, 3045, 7756, 273, 6704, 2221, 12477, 3249, 432, 337, 6941, 2801, 19153, 374, 1045, 316, 31850, 891, 1902, 1045, 316, 50276, 261, 247, 10898, 1332, 326, 476, 3157, 512, 253, 2905, 2987, 2299, 891, 42126, 923, 667, 28913, 1263, 273, 841, 767, 4295, 352, 310, 417, 2590, 281, 479, 604, 253, 7756, 310, 4269, 407, 581, 273, 731, 390, 1097, 273, 731, 50275, 21, 347, 5393, 407, 253, 4477, 253, 767, 7887, 4373, 22041, 275, 5150, 337, 1453, 253, 37139, 414, 1268, 891, 1902, 281, 923, 625, 4679, 342, 849, 841, 26682, 4833, 253, 37139, 414, 1268, 285, 253, 3969, 3045, 273, 6704, 2221, 12477, 50276, 37585, 963, 993, 50275, 34235, 273, 4983, 342, 247, 747, 12494, 627, 310, 271, 29257, 1860, 494, 9912, 846, 690, 3104, 24088, 374, 5784, 12494, 327, 3239, 337, 337, 296, 12494, 327, 3239, 818, 50276, 66, 2317, 310, 5816, 1078, 436, 6197, 253, 11786, 310, 5998, 327, 3239, 495, 50276, 14005, 50276, 18, 278, 406, 40160, 17630, 7187, 3638, 249, 1162, 355, 44755, 3733, 273, 13345, 11454, 6928, 342, 17825, 23507, 17769, 11797, 407, 2990, 5859, 3753, 10924, 11583, 4765, 11633, 50276, 19, 612, 5297, 2780, 13312, 1162, 355, 8132, 3390, 253, 36284, 2403, 512, 14997, 20721, 5213, 8059, 327, 5145, 4715, 268, 1686, 83, 9169, 50276, 20, 439, 74, 26981, 50276, 965, 86, 1162, 355, 23507, 3733, 3066, 43124, 819, 25004, 30535, 342, 6551, 250, 14520, 5723, 2824, 43425, 50276, 21, 2171, 3109, 1592, 480, 50276, 76, 647, 17616, 5650, 270, 43425, 10796, 363, 2721, 36284, 13571, 9079, 4560, 7899, 8985, 11454, 6928, 407, 819, 25004, 247, 12421, 17375, 2990, 17857, 32888, 43425, 50276, 22, 448, 27520, 8754, 4204, 8678, 1162, 355, 819, 25004, 12421, 31260, 11454, 6928, 342, 34560, 46852, 5723, 2824, 43425, 50275, 6050, 253, 3045, 6786, 407, 440, 32927, 48257, 310, 4722, 253, 16038, 285, 4679, 878, 625, 789, 50276, 187, 187, 4118, 18435, 27, 
2520, 2929, 21168, 327, 2045, 789, 327, 2221, 78, 6579, 352, 50276, 856, 6013, 281, 8171, 8985, 25965, 407, 247, 6704, 2221, 12477, 26332, 247, 6194, 494, 22381, 744, 3169, 8989, 326, 476, 1379, 2193, 432, 8437, 436, 1818, 275, 5019, 342, 253, 897, 273, 1045, 316, 5743, 3470, 285, 271, 1045, 316, 2173, 31850, 5700, 5644, 281, 247, 3012, 2169, 819, 25004, 2281, 1223, 7562, 12085, 3045, 50276, 249, 5301, 281, 8245, 3210, 50275, 2252, 30628, 5821, 326, 253, 2929, 310, 973, 3542, 285, 326, 253, 4081, 2746, 285, 253, 5661, 4342, 403, 4722, 253, 16038, 281, 3157, 4665, 1430, 369, 7744, 12351, 347, 24363, 1529, 50276, 3487, 2189, 326, 369, 5393, 310, 253, 3733, 673, 46505, 436, 2299, 943, 417, 320, 2668, 1512, 1199, 715, 2395, 1580, 253, 789, 41685, 1316, 265, 327, 4560, 253, 8004, 1896, 749, 18428, 326, 1335, 17923, 973, 1293, 6890, 253, 2801, 2193, 285, 275, 1386, 342, 789, 327, 253, 36284, 9079, 50276, 1468, 84, 387, 4685, 625, 670, 253, 2605, 273, 253, 9880, 14997, 534, 310, 4722, 323, 3139, 253, 2929, 3103, 943, 320, 7607 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: in this work the authors explore the relationship between 4 different brain regions multidemand network language network visual system auditory system and different features of program code specifically they look at hidden state representations of code language models seq2seq codeberta codetransformer xlnet tfidf and bow representations of the input for nonlm based features they look at a code vs sentence contrast variable language english vs japanese variable names data types strings vs numerals and control flow for vs if vs no branching to analyze relationships between the bold signal and stimulus features they build linear classifiers and regressors from the bold activity of each system to each of the stimulus features models are evaluated using classification accuracy and linear correlation for regression in the case of hiddenstate features models are evaluated using rank accuracy overall the authors find that the visual system is capable of significantly predicting several of the handcrafted code features suggesting that these features are correlated with lowlevel stimulus properties like program length while differences between md and ls are not significant these models successfully predict 56 handcrafted features in the code representation prediction task the authors find that the md ls and visual systems are able to rank significantly above chance however the ls and visual systems do not beat a random token embedding baseline aside from codeberta the md system is also not significantly above the random baseline updated review apologies for the delay i commend the authors for engaging thoroughly with the reviews providing new evidence and making convincing arguments having gone through their updated draft rebuttals and new identification experiment for the visual system confound i am convinced that the md and ls results show code processing distinct from pure visual features and that the random baseline performance in result 2 is perhaps an effect of the simple dataset not that the regions or models only care about tokenlevel information to this end i am increasing my score by 2 points having said that i agree with the other reviewers that this has been a productive discussion and the paper could benefit from more additions particularly stronger cleaner results to show what information is differentially processed in md vs ls etc i am still unconvinced that the current approach is more promising in deciphering brain function than encoding with variance partitioning hence i am unable to fully endorse this paper at this time strengths the paper was wellwritten and the research question is interesting and useful the use of publicly available data and code release is beneficial to the community the authors note that the visual roi can effectively classify 4 different stimulus properties but it does not significantly predict the token count this to me suggests that the 4 properties evaluated here are in fact strongly confounded with program length presence of letters etc and may not specifically correspond to code comprehension what do the authors think about such confounds being relevant in other areas as well and its impact on the conclusions reached in the paper overall i am not convinced that classification or identification accuracy is useful in gauging what properties of code are encoded by different brain regions it would be useful to introduce more contrasts or control for alternate
exploratory variables while the authors allude to several of the metrics being correlated like in the static analyses and byte code not only would it be useful to add a correlation matrix but also use a more systematic approach like variance partitioning to see what unique properties of code these feature spaces explain both against each other and confounds like program length the authors are making definitive statements about the observed performance patterns based on prior knowledge of the md system ls system etc however given the possibility of confounds and correlations between the metrics themselves i am unable to view these results as strong evidence for one mechanism over the other re experiment 1 why isnt the baseline just chance probability instead of a roundabout way with randomly sampled assignments for example it should be 1/3 for control flow or 0.5 for variable language there are no details on how the functional rois were identified did you use specific contrasts freesurfer annotations etc if indeed ac refers to primary ac perhaps the significance test being used is weak and cannot meaningfully identify significant effects re analysis 1 1 ivanova et al 2020 effectively demonstrate that the multidemand network uniquely responds to code however it is not clear to me what new findings are proposed in the current work beyond this 2 control flow and data type the authors make claims about accuracy differences between md and ls however these differences are not statistically significant by their own admission
 we follow this up by using ttests to examine whether any one brain region decodes any given property more significantly than another but find no differences between the md system and the language system for any properties table 8 appendix e 3 same comment holds for the contrast in dynamic and static analyses we additionally test if any brain region has a preference for a specific code property over another for instance is the evidence seen in figure 4 of md more accurately decoding dynamic analysis properties than static analysis properties statistically significant likewise does the ls predict static analysis properties significantly more accurately than the dynamic analysis properties we do not find any significant differences table 9 appendix e re experiment 2 what layer of the models was used is it the logits or softmax output can the authors provide more details on how rank accuracy was computed and how the regression task was set up it is unclear to me what scientific question this experiment is answering yes different brain regions have good identification accuracy for different brain models but isnt this expected from 1 prior work showing these regions respond to code and 2 these models capturing nonlinear features of the code stimuli what does this tell us about brain function i found weak evidence for the identification accuracies presented here since several of the tests were not statistically significant and did not beat the random embedding baseline
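as a concrete note on the rank accuracy question above here is a minimal sketch of the way i usually see rank accuracy computed the data shapes the cosine similarity choice and the normalization are my own assumptions for illustration and are not details taken from the paper

```python
import numpy as np

def rank_accuracy(predicted, targets):
    # predicted: (n, d) decoded embeddings; targets: (n, d) true embeddings for the same items
    p = predicted / np.linalg.norm(predicted, axis=1, keepdims=True)
    t = targets / np.linalg.norm(targets, axis=1, keepdims=True)
    sims = p @ t.T                              # similarity of every prediction to every candidate
    correct = np.diag(sims)[:, None]            # similarity to the matching item
    ranks = (sims > correct).sum(axis=1)        # 0 means the matching item is ranked first
    return 1.0 - ranks / (sims.shape[0] - 1)    # 1.0 = perfect, about 0.5 = chance

rng = np.random.default_rng(0)
print(rank_accuracy(rng.standard_normal((10, 8)), rng.standard_normal((10, 8))).mean())
```

if the paper instead uses a pairwise or top-k variant stating that explicitly would resolve the ambiguity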
re the following the fact that md maps to other complex models other than codeberta like codetransformer and seq2seq as accurately as the random embedding model is a strong indicator that the representations in these brain systems is strongly driven by tokenlevel information in the input programs i do not follow this claim do the authors mean brain systems or code models couldnt the same result be observed if the highlevel code information was not linearly decodable as permissible by the linear regression setting here other questions 1 since the magnitude of the predicted value is important here i am confused by the use of linear correlation as opposed to rmse 2 for discrete properties this is clearly not regression the authors should clarify that they build classification or regression models with an l2 penalty 3 what do the authors mean by zeroshot here since iiuc the model was never actually tested on stimulus decoding following training on code representation identification we show that it is possible to perform zeroshot decoding of the computer program being comprehended using a proxy representation produced by a suite of code models 4 how were code representations extracted from xlnet and seq2seq were the models finetuned on code while the proposition of understanding what aspects of code processing are represented in different brain regions is interesting in its current form i think the paper doesnt successfully disentangle this several of the effects reported are not significant above chance or random baselines and further not significantly different across brain regions the features considered are also confounded with lowlevel properties of code comprehension that havent been controlled for to this end it is hard to infer what properties different systems are encoding and if there is a takeaway beyond prior work that identifies regions robustly responding to code docsepthis paper introduces a systematic framework to discover the relationship between the brain representations of programs and their corresponding code models this framework helps us to understand the code properties encoded in the human brain so that we could evaluate whether ml models faithfully represent human brain representations of computer code comprehension this paper focuses on answering two questions by showing the results of related experiments on a dataset of 72 programs and 24 persons brain recordings first of all the authors show how well each of the four brain systems considered in this paper including the multiple demand language vision and auditory systems encodes specific code properties using a ridge regressor then they demonstrate another ridge regressor that can map brain representations to the corresponding learned representations by computational language models of code with different model complexity major comments and questions the results of experiment 1 show the accuracy of the brain representation of code properties it shows some correlation between brain regions and code properties however the ttest results show no significant differences for any brain regions having any preferences for a specific code property this experiment has been done only on one dataset with python code how reliable and pervasive is this kind of correlation in other datasets as we know python is a highlevel scripting language that might not be quite distinguishable from short sentences have you tried similar experiments on any other datasets or other lowerlevel languages such as cc or even assembly are there any improvements in
the brain code representations in this work compared to previous work such as ivanovas given that you are selecting a vector of roughly 1000 voxels for each region ivanovas paper reported some activity in the md system during code comprehension they showed moderate activity in language systems corresponding to python code comprehension and no activity regarding visual programming like scratchjr they also indicated that ls is responsive to python code while visual areas are responsive to scratchjr given that one other natural experiment to support the mapping of brain representations to the code model could be to try it on the brain representations of visual programs mostly in vision and md systems and the corresponding machinelearned encodings using visual recognition models such as convolutional neural nets have you tried such experiments on scratchjr codes similar to what you have done for python codes minor comments the explanation in the paper is fine however adding more diagrams of the whole procedure of the framework as well as mathematical descriptions of the models and their input and output as matrices and vectors would have helped a lot to understand the whole idea faster and easier the paper is well written the motivations are clear and the references are enough related work adequately established the existing research in the field and compared the goal of the current research to previous work however the ideas of the paper are incremental and mostly applications of existing methods which have been put together to approach an interesting open problem moreover usually further experiments on more datasets are required to evaluate any correlation between the code models and the brain representations in the proposed framework docsepthis paper investigates encoding of computer code in the human brain the authors build decoding models for fmri responses to predict 1 various properties of python code and 2 representations of python code derived from different machine learning models the main conclusion is that the responses from the multiple demand system in the brain are capable of providing significant decoding performance for properties and model representations of computer code such as runtime information strengths this paper is very clear and well written it investigates an interesting question of what kinds of information are decodable from brain representations of computer code experiments are well thought out and carefully controlled with a decent set of properties of interest and model representation results are compared with relatively extensive sets of models weakness i am not sure whether this paper actually provides a lot of new insights for both audiences from the neuroscience community and machine learning community it is not surprising to me that the multiple demand system in the brain is able to provide above chance decoding performance of the selected properties of the computer code the multiple demand system includes a large part of the prefrontal cortex that is generally responsible for any executive control and cognitive processing humans do it would be more interesting for example to run a searchlight algorithm over the brain to pin down specific locations where different properties of computer code are represented for experiment 2 other than model complexity another useful control would be the information integration window or context window of different models the models in comparison have very different mechanisms to integrate information across the time during which a
computer program is presented therefore it is unclear to me how the integration of information over time or length of program is going to affect the brain mapping to these representations the mapping from brain representations to context window is also potentially helpful for machine learning researchers in designing better models for code representations however the analysis in the paper so far largely limited to decoding based methods and only compared with limited number of controls are not very informative the paper has done thoughtful analysis and comparisons to investigate how the brain represent computer code and has great potential to be accepted but more indepth analysis are needed to make it a more informative paper for both the neuroscience andor machine learning audience docsepthis work examines the relationship between fmri recordings of people who read short programs and different properties and representations of the programming code the aim of the work is to understand what properties of code are encoded by different brain systems and to understand how similar the representations of code in the brain are to those encoded by selfsupervised language models that are pretrained to encode programming code the authors find that several program properties can be significantly decoded from 3 brain systems the multiple demand system the language system and the visual system they further find that representations of the programs extracted from several machine learning models of varying complexity can also be significantly decoded from these brain systems strengths code made publicly available and the paper also uses publicly available brain data mostly clear exposition with only a few places where more details are needed see questions below an indepth and timely empirical investigation weaknesses limited methodological novelty this work uses previously established methods for brain decoding and does not make advances either in the methodology or in the interpretation of existing methodology by itself this is not a fatal flaw a paper can have value to the community without such novelty limited evidence for some of the stated conclusions see question 1 below for more detailed comments one general comment when reading the main paper it is not always clear what differences between brain systems or ml models are actually significant limited discussion the investigation of the relationship between language models and the brain recordings has the potential to be very interesting but the discussion of these results is extremely limited the paper can benefit from a more thorough discussion of what is learned by the comparison with the language models for example it appears that a very simple model like bow reaches similar decoding performance to that of the most complex model codebertaso then what do we learn by the comparison with the complex model questions for authors 1 i have some concerns about the conclusions for 3 of the 4 types of experiments in section 51 under analysis 1 code vs sentences fig 3 shows that 3 of the brain systems multidemand language and visual can significantly distinguish programs according to whether they were written using code or using sentences with very similar accuracy in conclusion the authors state the underlying operations required to solve both types of programming problems were the same eg both might have required summing elements in a list yet despite the mental operations remaining mostly the same the md system is able to discriminate between the two 
contrasting conditions this supports the claims made by liu et al 2020 and ivanova et al 2020 that the md system is responsible for code comprehension in addition to being responsive to mentally tracing the execution flow of code snippetsa typical working memory task since the two conditions are easily distinguishable based on visual properties as evidence by the very high significant decoding accuracy in the visual system how can the authors make any claims about what the multidemand or the language systems are doing based on the decoding accuracy of these two conditions variable language here the authors examine the decoding differences between two conditions one in which the variable names are in english and the other where the variable names are in japanese the authors say that they expect a significant decoding performance in the language system would they not expect a significant decoding performance in the visual system shouldnt the most obvious difference be in the visual system because of the systematic visual differences between english and japanese fig 3 shows no significant decoding performance for any brain systems neither the language nor the visual system could this be due to the way the brain data is aggregated over trs for this analysis control flow and data type could the differences in decoding performance for the multidemand and language systems be due to a difference in the signaltonoise ratio in the two systems and how to the authors control for this possible snr difference between brain systems 2 can the authors comment on why they chose to do decoding analyses over encoding analyses ie predicting brain activity from representations of programs it seems that encoding could allow for a more indepth analysis of the differences in what properties different regionsbrain systems respond to ie by comparing the weights of the learned encoding models for different brain regionssystems 3 can the authors comment on why only leftlateralized regions of interest were examined several works in natural language comprehension with naturalistic stimuli show that both the left and right hemispheres are engaged during language comprehension wehbe et al 2014 huth et al 2016 jain and huth 2018 do the authors expect that the mirror brain regions in the right hemisphere do not engage in the same way during programming comprehension and if so do the authors believe that this is a function of their short stimuli ie if the stimuli were longer pieces of code we would expect both hemispheres to be engaged 4 one of the questions that was thoughtfully examined by the authors was how the complexity of the ml model affects its relationship with the brain recordings the authors clarify that what they mean by complexity is the number of model parameters another related model property that may affect the relationship with the brain recordings is the dimensionality of the output embeddingsdid the authors also examine this property its somewhat related to the number of parameters in the model but is not a perfect predictor ie bert and gpt2 have the same embedding size but different numbers of parameters overall 4 clarifications needed about representations of programs extracted from the models how exactly were they extracted was the whole program provided to the model at once the authors say that the output of the encoder was used but what about for autoregressive models ie the last state or something else brain experiment was the program presented at once or wordbyword for how long what were the 
subjects instructed to do brain data assuming that each program was presented for multiple trs how was the data from multiple trs handled in the decoding procedure overall the paper investigates an interesting and timely question but offers limited technical and empirical insights the work can be strengthened by a more indepth discussion about what is learned by the comparison with complex ml models and by additional analyses that strengthen the empirical conclusions see under questions ### Summary:
this paper aims to relate brain activity of people reading computer code to properties of the computer code they relate the found representations to those obtained from ml computational language models applied to the same programs the paper is clearly written and an interesting idea there was a lot of discussion and the authors updated their paper a lot program length as a potential confound was raised and successfully rebutted the extent of novelty from ivanova et al 2020 was also discussed and successfully rebutted in the end the main issues the reviewers had were 1 that the paper had been updated substantially since submission and would therefore benefit from a thorough rereview and 2 whether the results provide enough new insights about the brain or about ml language models to summarize the authors spent a lot of time addressing issues in the rebuttal phase and the paper got a lot better with the reviewers suggestions but reviewers agreed it would benefit from more work and further review before acceptance i agree with this assessment
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper studies the domain adaptation problem when the source data comes from multiple domains continuously and the test domain for adaptation is unknown the assumption used for domain shift is that the domain label would change the features but not the labels so the main idea is to learn invariant representations across all the domains and avoid spurious correlation on the domain label the proposed method then involves a multiplayer minimax game the adversaries are the domain discriminators for each class which try to maximize the domain discrepancy the minimizer player is the representation learner the paper introduces two discrepancy measures based on the jensenshannon divergence and the one dimensional wasserstein distance in the experiment the data set for continual learning is constructed using domain shift data such that it mimics the online learning setting the results are competitive in comparison with a limited set of baselines strong points 1 the paper focused on an interesting and important topic 2 the multiplayer adversarial game in terms of minimizing domain discrepancy seems to be novel weak points 1 the online or continual learning perspective is merely solved by keeping an episodic source data buffer which i think is overly simplified in general i have a question about how this adversarial method would work when there is not an online continual learning component given a fixed target and a static set of source domains it seems the method should be still valid so i am not sure how the domain generalization side and the online side of the method interact with each other investigating the online setting before the batched setting seems problematic to me 2 the evaluation in experiments shows that the fm measure of the proposed method is not very competitive i believe it is related to how the memory is sampled and the length of the task sequence however it also indicates the method is not very satisfactory for avoiding forgetting otherwise it could also be the case that the conditional domain shift assumption is not valid in the data 3 the idea to learn an invariant representation is not novel for example the invariant risk minimization irm method is exactly dealing with the same problem using an episodic source data buffer it seems you can also apply irm to solve the problem i think it should be included as a baseline in general more invariant learning approaches should be discussed in the related work 4 writing can be significantly improved given the weak points i recommend rejection for this paper here are some of my questions and additional suggestions 1 i am curious about what the invariant feature looks like basically after learning the adversarial domain predictors what the learned representation looks like it would be nice to have some visualization on that 2 i do recommend focusing on a batched problem before going to the online version i feel even though it is not the case that the batched method will work for online cases at least in domain generalization we cannot expect a bad learner to work online when the data is even more limited even for an online paper showing the performance for the batched version seems to be necessary 3 how alpha is chosen and how it affects learning in general more ablation studies would improve paper quality docsepsummary this paper talks about online continual learning in scenarios where there might be domain shift during test
time though i find the problem to be important i believe that the solution proposed in this paper is straightforward which is fine but it imposes a new set of constraints knowing the domain id a priori during training making the problem quite impractical for scenarios where continual learning might be useful please find below my comments in detail 1 problem formulation i do agree with the authors that the task-free setup also known as single-head is something we should focus more on and i like the fact that they remove this constraint of knowing the task id during train/test however while relaxing this constraint they added a new assumption of knowing the domain id during training i find this to be an even stricter requirement the right problem formulation would be one where there is a stream of samples coming online with a relatively blurry task boundary and these samples might belong to different domains knowing domains a priori is very impractical in a continual learning setup could you please describe scenarios where it is feasible to know the domain id a priori 2 approach while i understand the er part i do not understand section 41 clearly what is the need for all the gan literature here why do you call dj a domain critic am i wrong in saying that the only role of dj for class yj is to apply a cross-entropy loss over domains clipping if needed please correct me if i am wrong the section was a bit unclear to follow 3 how do you make sure that the assumption of data balance is valid given that the memory is very small compared to the current dataset also it seems that the memory sizes of 9k 10k and 34k are too big please comment 4 training time since the setup is online training efficiently is extremely crucial in algorithm 1 every parameter update requires y extra backprops for the domain critics sgd step could you please comment on the training time 5 references i would suggest that the authors correct their citations a bit for example when talking about task-free i would also cite icarl synaptic intelligence si and rwalk similarly the forgetting measure used in this paper was not proposed in chaudhry 2019a it was proposed in rwalk overall even though the problem is important i find the final experimental setup to be impractical there already are too many impractical formulations for continual learning and i would rather refrain from encouraging a new one on top of this there are a few technical aspects training time memory budget etc that i do not completely agree with i would therefore request the authors to answer the above questions for clarity docsepthis paper investigates continual learning under domain shift and proposes a conditional invariant experience replay cier method accordingly cier aims to retain old knowledge acquire new information and generalize to unseen domains in particular cier uses adversarial training to correct the domain shift experimental results on three benchmark datasets are reported and discussed pros 1 the problem setting of continual learning with domain shift is well motivated the authors explained the rationality of this new problem setting by using a causal model 2 three scenarios are considered and evaluated including the significant shift mild shift and no shift experimental results show that the proposed methods outperform baselines in many cases 3 overall the paper is well written and clearly organized the technical details are easy to follow cons 1 although the problem setting is new my main concern is the limited novelty of the proposed cier method in detail experience replay has been extensively
studied in continual learning while domain-adversarial learning has also been widely used in the domain adaptation literature for many years the proposed cier method as illustrated in figure 2 simply combines these two components 2 ablation studies of the proposed method are missing thus it is difficult to understand the contribution of each component the effects of different sampling buffer sizes etc docsepid like to preface this review by saying that i am not an expert in this area of continual learning hence i will adjust my review after reading the authors response as well as other reviewers comments accordingly this work proposes cier a continual learning method that adjusts to domain shift during test time the authors claim that existing methods correct distribution shift in p(x, y) which makes the stronger assumption that p(y|x) is the same across domains in contrast this work follows prior work by zhang 2013 and corrects distribution shift in p(x|y) which makes the weaker assumption that p(y) is the same across domains the contributions of this work include using an adversarial objective via a critic that attempts to distinguish the domain of the learned representation f(x|y) as well as the adoption of an experience replay buffer from which to sample examples to optimize the minimax adversarial objective my concerns are as follows 1 i am skeptical of the central claim of this work which is that continual learning does not address domain shift i am not an expert in this area but to my knowledge all reinforcement learning work that trains on one environment and adapts to another environment must continually learn in a different environment eg https://arxiv.org/abs/1803.11347 https://arxiv.org/abs/1905.04819 https://arxiv.org/abs/1910.08210 can the authors comment on whether they think these works that both continually learn and adapt to new domains are relevant if so would it make sense to compare to them moreover these works and others in domain adaptation do not assume that p(y) is identical across domains hence i feel like this is a rather strong assumption 2 the manuscript rehashes the story of learning an invariant f(x|y) repeatedly however the terms in which it does this are not precise for example how do the authors define domain task test domain target domain invariance stability i suggest that the authors define these precisely for example with mathematical definitions and give concrete examples grounded in application settings where possible 3 when the authors say that the distribution is stable they mean that there is no class imbalance between domains however stability in a distribution leads to a number of consequences https://en.wikipedia.org/wiki/Stable_distribution are these consequences necessary if so in what ways if not would it make sense to just say there is no class imbalance between domains 4 the manuscript contains some traces of notation abuse which make it hard to read for example d and its cursive variant represent critic and domain the lowercase t represents both task and time the authors alternate between test domain and target domain i suggest that the authors make their terminology consistent 5 in section 41 the authors propose learning a representation conditionally invariant in p(x|y) however they also say that this was done by zhang 2013 and li 2018 hence this is not a contribution correct in summary due to the limitations posed by the assumption of this work p(y) the same across domains the incremental contribution of this work adversarial loss experience replay and the lack of clarity and precision in the
manuscript i recommend a weak rejection due to my lack of experience in continual learning i cannot assess the strength of the benchmarks and the significance of the experimental results and will defer to other reviewers ### Summary:
this paper presented an online continual learning method where there may be a shift in data distribution at test time the paper proposes a conditional invariant experience replay cier approach to correct the shift which matches the distribution of inputs conditioned on the outputs this is based on an adversarial training scheme the reviewers found the problem setting interesting but found the approach to be lacking in novelty and the problem formulation to be somewhat restrictive eg requiring the domain id during training the author feedback was taken into account but the reviewers stayed with their original assessment and even after the rebuttal phase none of the reviewers is in favor of accepting the paper the authors are advised to consider the feedback from the reviewers which will hopefully help to improve the paper for a future submission to another venue
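For readers who want to see roughly what the class-conditional adversarial objective and episodic replay buffer discussed in the review above might look like in code, here is a minimal illustrative sketch. It is not the authors' implementation: the module names, network sizes, the weighting factor alpha, and the replay size are hypothetical choices, and the per-class critics are plain domain classifiers trained on detached features while the representation learner is updated to fit the labels and fool those critics.

```python
# Illustrative sketch only -- NOT the paper's implementation. One domain critic per
# class tries to identify the domain of a feature; the shared representation learner
# and the label classifier are trained to fit the labels and fool the critics, with
# an episodic replay buffer supplying past samples.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureNet(nn.Module):
    """Shared representation learner (the minimizing player)."""
    def __init__(self, in_dim, feat_dim):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, feat_dim))

    def forward(self, x):
        return self.body(x)

def make_critics(num_classes, feat_dim, num_domains):
    """One small domain classifier per class label (labels assumed to be 0..num_classes-1)."""
    return nn.ModuleList(nn.Linear(feat_dim, num_domains) for _ in range(num_classes))

def replay_mix(batch, memory, k):
    """Concatenate the current minibatch with up to k samples from the episodic buffer."""
    x, y, d = batch
    if memory:
        mx, my, md = zip(*random.sample(memory, min(k, len(memory))))
        x = torch.cat([x, torch.stack(mx)])
        y = torch.cat([y, torch.stack(my)])
        d = torch.cat([d, torch.stack(md)])
    return x, y, d

def training_step(feat_net, classifier, critics, opt_main, opt_critics,
                  batch, memory, alpha=0.1, replay_k=32):
    x, y, d = replay_mix(batch, memory, replay_k)
    z = feat_net(x)

    # (1) critics (maximizing players): predict the domain within each class,
    #     using detached features so only the critics are updated here.
    critic_loss = sum(F.cross_entropy(critics[c](z[y == c].detach()), d[y == c])
                      for c in y.unique().tolist())
    opt_critics.zero_grad()
    critic_loss.backward()
    opt_critics.step()

    # (2) representation + classifier: fit the labels while *fooling* the critics,
    #     i.e. maximizing per-class domain confusion (hence the minus sign).
    adv_loss = sum(F.cross_entropy(critics[c](z[y == c]), d[y == c])
                   for c in y.unique().tolist())
    task_loss = F.cross_entropy(classifier(z), y)
    opt_main.zero_grad()
    (task_loss - alpha * adv_loss).backward()
    opt_main.step()
    return task_loss.item()
```

In an online run each incoming example would also be pushed into the memory buffer (for instance with reservoir sampling) so later steps can replay it; the extra backward passes per update that the review asks about correspond to the per-class critic terms in step (1).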
[ 908, 323, 5028, 5333, 310, 326, 253, 5028, 5203, 651, 1818, 253, 3386, 533, 417, 253, 13301, 594, 253, 2022, 2934, 310, 281, 3037, 13727, 14237, 2439, 512, 253, 10625, 285, 3693, 46541, 5921, 327, 253, 5028, 5203, 253, 4081, 1332, 840, 8687, 247, 44047, 7221, 991, 2165, 253, 18539, 3927, 403, 253, 5028, 7134, 12915, 323, 1016, 966, 534, 14177, 281, 22950, 253, 5028, 26210, 253, 7221, 6081, 4760, 310, 253, 6779, 458, 47612, 253, 2929, 23970, 767, 26210, 2557, 1754, 327, 253, 480, 561, 561, 73, 16554, 23279, 285, 253, 581, 15759, 369, 2152, 6339, 4181, 275, 253, 3368, 253, 941, 873, 323, 45120, 4715, 310, 8818, 970, 5028, 5333, 941, 824, 326, 352, 43341, 253, 3909, 4715, 4758, 253, 1543, 403, 12085, 275, 5301, 342, 247, 3710, 873, 273, 1666, 25379, 50273, 9072, 2792, 337, 253, 2929, 7106, 327, 271, 4722, 285, 1774, 9400, 374, 253, 44047, 48960, 2165, 275, 2426, 273, 28699, 5028, 26210, 3133, 281, 320, 4460, 50275, 20881, 2792, 337, 253, 3909, 390, 45120, 4715, 8668, 310, 7960, 14042, 407, 7562, 271, 6314, 23329, 2603, 941, 6391, 534, 891, 1158, 310, 27662, 21010, 275, 2087, 891, 452, 247, 1953, 670, 849, 436, 48960, 1332, 651, 789, 672, 627, 310, 417, 271, 3909, 8190, 780, 4715, 4445, 1677, 247, 4229, 2303, 436, 247, 4228, 873, 273, 2603, 10625, 352, 3133, 253, 1332, 943, 320, 1335, 3588, 594, 891, 717, 417, 2119, 849, 253, 5028, 26647, 1930, 285, 253, 3909, 1930, 273, 253, 1332, 8008, 342, 1016, 643, 15686, 253, 3909, 4758, 1078, 253, 10464, 2147, 4758, 3133, 20276, 281, 479, 50275, 19, 253, 7103, 275, 4679, 2722, 326, 253, 49555, 2557, 273, 253, 4081, 1332, 310, 417, 1077, 12085, 891, 2868, 352, 310, 2905, 281, 849, 253, 3541, 310, 19958, 285, 253, 2978, 273, 253, 4836, 3425, 2299, 352, 671, 6492, 253, 1332, 310, 417, 1077, 20297, 323, 17816, 37264, 5010, 352, 812, 671, 320, 253, 1083, 253, 17697, 5028, 5333, 9376, 310, 417, 3588, 275, 253, 941, 50275, 20, 253, 2934, 281, 3037, 271, 13727, 6779, 310, 417, 4460, 323, 1650, 253, 13727, 2495, 41458, 209, 2683, 1332, 310, 4555, 10620, 342, 253, 1072, 1895, 970, 271, 6314, 23329, 2603, 941, 6391, 352, 3133, 368, 476, 671, 4647, 209, 2683, 281, 8415, 253, 1895, 891, 1158, 352, 943, 320, 2908, 347, 247, 8245, 275, 2087, 625, 13727, 4715, 7274, 943, 320, 5469, 275, 253, 2905, 789, 50275, 21, 4028, 476, 320, 3012, 5520, 50275, 28821, 253, 5075, 2792, 891, 5583, 18235, 323, 436, 2929, 50275, 1568, 403, 690, 273, 619, 3533, 285, 3081, 13991, 337, 891, 717, 14338, 670, 849, 253, 13727, 4735, 4453, 751, 10323, 846, 4715, 253, 48960, 5028, 23477, 849, 253, 6779, 4715, 4453, 751, 352, 651, 320, 5322, 281, 452, 690, 24426, 327, 326, 50276, 19, 891, 513, 5583, 13654, 327, 247, 10464, 2147, 1895, 1078, 1469, 281, 253, 3909, 2715, 891, 1928, 1014, 2167, 352, 310, 417, 253, 1083, 326, 253, 10464, 2147, 1332, 588, 789, 323, 3909, 2219, 387, 1878, 275, 5028, 26647, 359, 2550, 1902, 247, 3076, 458, 47612, 281, 789, 3909, 672, 253, 941, 310, 1014, 625, 3710, 1014, 323, 271, 3909, 2929, 4645, 253, 3045, 323, 253, 10464, 2147, 2715, 3133, 281, 320, 3309, 50276, 20, 849, 9765, 310, 6777, 285, 849, 352, 11852, 4715, 275, 2087, 625, 28913, 2175, 651, 3157, 2929, 3290, 50276, 7152, 339, 793, 360, 3454, 436, 2929, 12088, 670, 3909, 45120, 4715, 275, 15216, 835, 627, 1537, 320, 5028, 11551, 1309, 1071, 673, 2167, 891, 1089, 253, 1895, 281, 320, 1774, 891, 2868, 326, 253, 2900, 4081, 275, 436, 2929, 310, 15246, 534, 310, 4030, 285, 35979, 247, 747, 873, 273, 10806, 8958, 247, 30400, 253, 5028, 2654, 1309, 3733, 2403, 253, 1895, 3240, 45783, 323, 15216, 835, 45120, 4715, 
1537, 320, 4217, 4496, 1089, 2708, 619, 5701, 275, 2508, 50276, 18, 1895, 15895, 891, 513, 5194, 342, 253, 4477, 326, 4836, 4924, 9978, 671, 1929, 347, 2014, 2522, 310, 1633, 359, 943, 2770, 625, 327, 285, 751, 253, 958, 326, 597, 5386, 436, 7658, 273, 8958, 4836, 2654, 1309, 1140, 565, 383, 2299, 1223, 32196, 436, 7658, 597, 2879, 247, 747, 9376, 273, 8958, 253, 5028, 2654, 1309, 3733, 891, 1089, 436, 2581, 247, 625, 7654, 8284, 253, 987, 1895, 15895, 651, 320, 835, 627, 310, 247, 5542, 273, 3530, 3551, 3909, 342, 247, 4942, 787, 20657, 4836, 7548, 285, 841, 3530, 1537, 5663, 281, 1027, 10625, 8958, 10625, 247, 30400, 310, 1077, 45783, 275, 45120, 4715, 9978, 812, 368, 4496, 15216, 835, 697, 17887, 281, 871, 5028, 2654, 247, 30400, 50275, 19, 2746, 1223, 891, 2096, 253, 2827, 629, 891, 513, 417, 2096, 4518, 2593, 7609, 752, 310, 253, 878, 273, 512, 36827, 6239, 1060, 2139, 513, 368, 1067, 277, 75, 247, 5028, 7291, 717, 891, 3430, 275, 3981, 326, 253, 760, 2554, 273, 277, 75, 323, 966, 340, 75, 310, 281, 4647, 2831, 290, 10144, 2957, 689, 10625, 502, 8201, 604, 3058, 4496, 3451, 479, 604, 891, 717, 3430, 253, 2593, 369, 247, 2372, 12744, 281, 956, 50275, 20, 849, 513, 368, 1056, 2119, 326, 253, 9376, 273, 941, 6654, 310, 3588, 1677, 326, 253, 3541, 310, 1077, 1355, 2429, 281, 253, 1655, 10895, 671, 352, 3133, 751, 326, 253, 3541, 9552, 273, 898, 76, 884, 76, 285, 5910, 76, 403, 1512, 1943, 4496, 4385, 50275, 21, 3733, 673, 1580, 253, 9978, 310, 3909, 3733, 14556, 310, 6685, 9560, 275, 5933, 337, 1046, 4764, 5731, 4419, 340, 4465, 896, 21390, 323, 253, 5028, 17139, 256, 35333, 3213, 812, 368, 4496, 4385, 327, 253, 3733, 673, 50276, 22, 10414, 891, 651, 1804, 253, 4477, 281, 3451, 616, 30404, 247, 2372, 323, 1650, 672, 5015, 670, 4836, 4924, 891, 651, 671, 26542, 17857, 7694, 21066, 9260, 4927, 285, 391, 13678, 12014, 253, 37264, 2557, 908, 275, 436, 2929, 369, 417, 4081, 275, 448, 5353, 73, 610, 6247, 66, 352, 369, 4081, 275, 391, 13678, 50275, 1189, 455, 1014, 2167, 253, 1895, 310, 1774, 891, 1089, 253, 2457, 5661, 9978, 281, 320, 45783, 627, 2168, 403, 1512, 1142, 45783, 26850, 323, 45120, 4715, 891, 651, 2581, 35531, 4266, 432, 18462, 247, 747, 581, 327, 1755, 273, 436, 627, 403, 247, 1643, 7681, 7794, 3733, 673, 3541, 7563, 3966, 891, 513, 417, 4336, 5194, 342, 3103, 651, 2748, 4477, 281, 3662, 1840, 3533, 323, 19843, 7152, 33032, 2520, 2929, 2340, 684, 45120, 4715, 762, 5028, 5333, 285, 29328, 247, 17697, 13727, 2793, 44864, 260, 1321, 1332, 15672, 260, 1321, 13698, 281, 13280, 1711, 3640, 16270, 747, 1491, 285, 39970, 281, 39709, 10625, 275, 1798, 260, 1321, 4648, 48960, 3733, 281, 3451, 253, 5028, 5333, 5661, 1543, 327, 1264, 22791, 15302, 403, 2361, 285, 5469, 50276, 856, 84, 337, 253, 1895, 4758, 273, 45120, 4715, 342, 5028, 5333, 310, 973, 17194, 253, 4477, 5544, 253, 8870, 414, 273, 436, 747, 1895, 4758, 407, 970, 247, 19349, 1566, 50276, 19, 1264, 15216, 403, 2783, 285, 6760, 1690, 253, 1534, 5333, 11134, 5333, 285, 642, 5333, 5661, 1543, 921, 326, 253, 4081, 3082, 562, 32231, 1666, 25379, 275, 1142, 2219, 495, 4583, 253, 2929, 310, 973, 3542, 285, 4518, 10932, 253, 7681, 4278, 403, 3477, 281, 956, 50272, 5040, 50276, 18, 3738, 253, 1895, 4758, 310, 747, 619, 2022, 4468, 310, 253, 3710, 38135, 273, 253, 4081, 260, 1321, 1332, 275, 2508, 2793, 44864, 556, 644, 18171, 5421, 275, 45120, 4715, 1223, 253, 5028, 48960, 4715, 556, 671, 644, 7561, 908, 275, 5028, 15644, 6239, 323, 1142, 1107, 253, 4081, 260, 1321, 1332, 347, 12800, 275, 4677, 374, 3365, 24772, 841, 767, 4295, 50276, 19, 
28913, 2175, 273, 253, 4081, 1332, 403, 5816, 3021, 352, 310, 2834, 281, 2096, 253, 7680, 273, 1016, 4445, 253, 2538, 273, 1027, 10491, 6391, 9552, 3966, 5474, 339, 4239, 751, 281, 638, 1664, 436, 2278, 407, 3981, 326, 891, 717, 417, 271, 6485, 275, 436, 2170, 273, 45120, 4715, 7613, 891, 588, 4575, 619, 2278, 846, 4361, 253, 4477, 2380, 347, 973, 347, 643, 30628, 5701, 15672, 50276, 2520, 789, 29328, 260, 1321, 247, 45120, 4715, 1332, 326, 4575, 84, 281, 5028, 5333, 1309, 1071, 673, 253, 4477, 1750, 326, 5368, 3082, 3451, 3268, 5333, 275, 268, 89, 340, 534, 2789, 253, 10046, 9376, 326, 7239, 50276, 89, 310, 253, 1072, 2439, 10625, 275, 3310, 436, 789, 956, 2720, 789, 407, 1182, 12109, 4072, 285, 3451, 84, 3268, 5333, 275, 268, 89, 50276, 90, 534, 2789, 253, 21076, 9376, 326, 7239, 310, 253, 1072, 2439, 10625, 253, 7680, 273, 436, 789, 2486, 970, 271, 48960, 8103, 3066, 247, 7291, 326, 9437, 281, 12129, 253, 5028, 273, 253, 6311, 6779, 269, 89, 50276, 90, 347, 973, 347, 253, 16253, 273, 271, 2793, 44864, 6391, 432, 534, 281, 3410, 6667, 281, 22318, 253, 7221, 991, 48960, 8103, 50276, 2577, 7350, 403, 347, 3637, 337, 891, 717, 33872, 273, 253, 4275, 1750, 273, 436, 789, 534, 310, 326, 45120, 4715, 1057, 417, 2953, 5028, 5333, 891, 717, 417, 271, 6485, 275, 436, 2170, 533, 281, 619, 3640, 512, 35221, 4715, 789, 326, 6194, 327, 581, 3126, 285, 5223, 84, 281, 1529, 3126, 1364, 45120, 3037, 275, 247, 1027, 3126, 24088, 5987, 39962, 2061, 5375, 1093, 2941, 883, 23568, 5987, 39962, 2061, 5375, 16129, 1235, 2385, 746, 5987, 39962, 2061, 5375, 746, 2313, 3507, 740, 476, 253, 4477, 4385, 327, 1880, 597, 1158, 841, 2987, 326, 1097, 23265, 3037, 285, 5223, 281, 747, 10625, 403, 4623, 604, 594, 651, 352, 1056, 3282, 281, 7277, 281, 731, 25761, 841, 2987, 285, 2571, 275, 5028, 15644, 513, 417, 5467, 326, 7239, 310, 8931, 2439, 10625, 7613, 891, 1928, 751, 436, 310, 247, 2581, 2266, 9376, 374, 253, 7714, 294, 13362, 253, 2926, 273, 4715, 271, 13727, 269, 89, 50276, 90, 12889, 2299, 253, 2426, 275, 534, 352, 1057, 436, 403, 417, 10799, 323, 1650, 849, 513, 253, 4477, 4853, 5028, 4836, 1071, 5028, 2303, 5028, 31429, 7882, 891, 1804, 326, 253, 4477, 4853, 841, 10534, 323, 1650, 259, 15965, 14308, 285, 1918, 11859, 6667, 28462, 275, 2898, 7533, 835, 1896, 495, 672, 253, 4477, 1333, 326, 253, 3268, 310, 6474, 597, 1599, 326, 627, 310, 642, 966, 31561, 875, 10625, 2299, 7882, 275, 247, 3268, 5644, 281, 247, 1180, 273, 9099, 5987, 257, 25842, 2061, 44874, 382, 5063, 382, 2382, 50276, 609, 841, 9099, 3309, 604, 594, 275, 752, 4088, 604, 417, 651, 352, 1056, 3282, 281, 816, 1333, 627, 310, 642, 966, 31561, 875, 10625, 577, 253, 7714, 4428, 690, 20274, 273, 14951, 7242, 534, 1056, 352, 1892, 281, 1239, 323, 1650, 277, 285, 697, 50215, 422, 12955, 1957, 7291, 285, 5028, 253, 2406, 5045, 246, 6125, 1097, 4836, 285, 673, 253, 4477, 17958, 875, 1071, 5028, 285, 2303, 5028, 891, 5936, 326, 253, 4477, 1056, 616, 28939, 5185, 608, 275, 2593, 7609, 253, 4477, 12661, 4715, 247, 6779, 1617, 595, 13727, 275, 268, 5246, 2299, 597, 671, 1333, 326, 436, 369, 2218, 407, 1182, 12109, 4072, 285, 632, 4765, 7613, 436, 310, 417, 247, 7680, 3451, 50276, 249, 6010, 1955, 281, 253, 7364, 22691, 407, 253, 9376, 273, 436, 789, 7239, 253, 1072, 2439, 10625, 253, 32809, 7680, 273, 436, 789, 48960, 2957, 50276, 38835, 44864, 285, 253, 3480, 273, 19843, 285, 12320, 275, 253, 7714, 50276, 74, 5583, 247, 5075, 18235, 1955, 281, 619, 3480, 273, 2793, 275, 45120, 4715, 891, 2550, 2939, 253, 4757, 273, 253, 49602, 285, 253, 8453, 273, 253, 5661, 
1543, 285, 588, 36574, 281, 643, 30628, 187, 187, 4118, 18435, 27, 2520, 2929, 3559, 271, 3909, 45120, 4715, 1332, 835, 627, 778, 320, 247, 5333, 275, 941, 3268, 387, 1071, 673, 253, 2929, 29328, 247, 17697, 13727, 2793, 44864, 260, 1321, 2746, 281, 3451, 253, 2159, 534, 10129, 253, 3268, 273, 14800, 27039, 327, 253, 18012, 436, 310, 1754, 327, 271, 48960, 3733, 6974, 50276, 783, 30628, 1119, 253, 1895, 4758, 4722, 533, 1119, 253, 2746, 281, 320, 14999, 275, 38135, 285, 1895, 15895, 8489, 29190, 24088, 50276, 1844, 4261, 5028, 2654, 1309, 3733, 253, 2488, 8680, 369, 2668, 715, 2395, 533, 253, 30628, 11791, 342, 616, 3236, 6803, 285, 1014, 846, 253, 30080, 22559, 3408, 5293, 273, 253, 30628, 310, 275, 3718, 273, 18738, 253, 2929, 50276, 783, 4477, 403, 15140, 281, 1908, 253, 8680, 432, 253, 30628, 534, 588, 18670, 1361, 281, 3157, 253, 2929, 323, 247, 2852, 19529, 281, 1529, 18767 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 908, 323, 5028, 5333, 310, 326, 253, 5028, 5203, 651, 1818, 253, 3386, 533, 417, 253, 13301, 594, 253, 2022, 2934, 310, 281, 3037, 13727, 14237, 2439, 512, 253, 10625, 285, 3693, 46541, 5921, 327, 253, 5028, 5203, 253, 4081, 1332, 840, 8687, 247, 44047, 7221, 991, 2165, 253, 18539, 3927, 403, 253, 5028, 7134, 12915, 323, 1016, 966, 534, 14177, 281, 22950, 253, 5028, 26210, 253, 7221, 6081, 4760, 310, 253, 6779, 458, 47612, 253, 2929, 23970, 767, 26210, 2557, 1754, 327, 253, 480, 561, 561, 73, 16554, 23279, 285, 253, 581, 15759, 369, 2152, 6339, 4181, 275, 253, 3368, 253, 941, 873, 323, 45120, 4715, 310, 8818, 970, 5028, 5333, 941, 824, 326, 352, 43341, 253, 3909, 4715, 4758, 253, 1543, 403, 12085, 275, 5301, 342, 247, 3710, 873, 273, 1666, 25379, 50273, 9072, 2792, 337, 253, 2929, 7106, 327, 271, 4722, 285, 1774, 9400, 374, 253, 44047, 48960, 2165, 275, 2426, 273, 28699, 5028, 26210, 3133, 281, 320, 4460, 50275, 20881, 2792, 337, 253, 3909, 390, 45120, 4715, 8668, 310, 7960, 14042, 407, 7562, 271, 6314, 23329, 2603, 941, 6391, 534, 891, 1158, 310, 27662, 21010, 275, 2087, 891, 452, 247, 1953, 670, 849, 436, 48960, 1332, 651, 789, 672, 627, 310, 417, 271, 3909, 8190, 780, 4715, 4445, 1677, 247, 4229, 2303, 436, 247, 4228, 873, 273, 2603, 10625, 352, 3133, 253, 1332, 943, 320, 1335, 3588, 594, 891, 717, 417, 2119, 849, 253, 5028, 26647, 1930, 285, 253, 3909, 1930, 273, 253, 1332, 8008, 342, 1016, 643, 15686, 253, 3909, 4758, 1078, 253, 10464, 2147, 4758, 3133, 20276, 281, 479, 50275, 19, 253, 7103, 275, 4679, 2722, 326, 253, 49555, 2557, 273, 253, 4081, 1332, 310, 417, 1077, 12085, 891, 2868, 352, 310, 2905, 281, 849, 253, 3541, 310, 19958, 285, 253, 2978, 273, 253, 4836, 3425, 2299, 352, 671, 6492, 253, 1332, 310, 417, 1077, 20297, 323, 17816, 37264, 5010, 352, 812, 671, 320, 253, 1083, 253, 17697, 5028, 5333, 9376, 310, 417, 3588, 275, 253, 941, 50275, 20, 253, 2934, 281, 3037, 271, 13727, 6779, 310, 417, 4460, 323, 1650, 253, 13727, 2495, 41458, 209, 2683, 1332, 310, 4555, 10620, 342, 253, 1072, 1895, 970, 271, 6314, 23329, 2603, 941, 6391, 352, 3133, 368, 476, 671, 4647, 209, 2683, 281, 8415, 253, 1895, 891, 1158, 352, 943, 320, 2908, 347, 247, 8245, 275, 2087, 625, 13727, 4715, 7274, 943, 320, 5469, 275, 253, 2905, 789, 50275, 21, 4028, 476, 320, 3012, 5520, 50275, 28821, 253, 5075, 2792, 891, 5583, 18235, 323, 436, 2929, 50275, 1568, 403, 690, 273, 619, 3533, 285, 3081, 13991, 337, 891, 717, 14338, 670, 849, 253, 13727, 4735, 4453, 751, 10323, 846, 4715, 253, 48960, 5028, 23477, 849, 253, 6779, 4715, 4453, 751, 352, 651, 320, 5322, 281, 452, 690, 24426, 327, 326, 50276, 19, 891, 513, 5583, 13654, 327, 247, 10464, 2147, 1895, 1078, 1469, 281, 253, 3909, 2715, 891, 1928, 1014, 2167, 352, 310, 417, 253, 1083, 326, 253, 10464, 2147, 1332, 588, 789, 323, 3909, 2219, 387, 1878, 275, 5028, 26647, 359, 2550, 1902, 247, 3076, 458, 47612, 281, 789, 3909, 672, 253, 941, 310, 1014, 625, 3710, 1014, 323, 271, 3909, 2929, 4645, 253, 3045, 323, 253, 10464, 2147, 2715, 3133, 281, 320, 3309, 50276, 20, 849, 9765, 310, 6777, 285, 849, 352, 11852, 4715, 275, 2087, 625, 28913, 2175, 651, 3157, 2929, 3290, 50276, 7152, 339, 793, 360, 3454, 436, 2929, 12088, 670, 3909, 45120, 4715, 275, 15216, 835, 627, 1537, 320, 5028, 11551, 1309, 1071, 673, 2167, 891, 1089, 253, 1895, 281, 320, 1774, 891, 2868, 326, 253, 2900, 4081, 275, 436, 2929, 310, 15246, 534, 310, 4030, 285, 35979, 247, 747, 873, 273, 10806, 8958, 247, 30400, 253, 5028, 2654, 1309, 3733, 2403, 253, 1895, 3240, 45783, 323, 15216, 835, 45120, 4715, 
1537, 320, 4217, 4496, 1089, 2708, 619, 5701, 275, 2508, 50276, 18, 1895, 15895, 891, 513, 5194, 342, 253, 4477, 326, 4836, 4924, 9978, 671, 1929, 347, 2014, 2522, 310, 1633, 359, 943, 2770, 625, 327, 285, 751, 253, 958, 326, 597, 5386, 436, 7658, 273, 8958, 4836, 2654, 1309, 1140, 565, 383, 2299, 1223, 32196, 436, 7658, 597, 2879, 247, 747, 9376, 273, 8958, 253, 5028, 2654, 1309, 3733, 891, 1089, 436, 2581, 247, 625, 7654, 8284, 253, 987, 1895, 15895, 651, 320, 835, 627, 310, 247, 5542, 273, 3530, 3551, 3909, 342, 247, 4942, 787, 20657, 4836, 7548, 285, 841, 3530, 1537, 5663, 281, 1027, 10625, 8958, 10625, 247, 30400, 310, 1077, 45783, 275, 45120, 4715, 9978, 812, 368, 4496, 15216, 835, 697, 17887, 281, 871, 5028, 2654, 247, 30400, 50275, 19, 2746, 1223, 891, 2096, 253, 2827, 629, 891, 513, 417, 2096, 4518, 2593, 7609, 752, 310, 253, 878, 273, 512, 36827, 6239, 1060, 2139, 513, 368, 1067, 277, 75, 247, 5028, 7291, 717, 891, 3430, 275, 3981, 326, 253, 760, 2554, 273, 277, 75, 323, 966, 340, 75, 310, 281, 4647, 2831, 290, 10144, 2957, 689, 10625, 502, 8201, 604, 3058, 4496, 3451, 479, 604, 891, 717, 3430, 253, 2593, 369, 247, 2372, 12744, 281, 956, 50275, 20, 849, 513, 368, 1056, 2119, 326, 253, 9376, 273, 941, 6654, 310, 3588, 1677, 326, 253, 3541, 310, 1077, 1355, 2429, 281, 253, 1655, 10895, 671, 352, 3133, 751, 326, 253, 3541, 9552, 273, 898, 76, 884, 76, 285, 5910, 76, 403, 1512, 1943, 4496, 4385, 50275, 21, 3733, 673, 1580, 253, 9978, 310, 3909, 3733, 14556, 310, 6685, 9560, 275, 5933, 337, 1046, 4764, 5731, 4419, 340, 4465, 896, 21390, 323, 253, 5028, 17139, 256, 35333, 3213, 812, 368, 4496, 4385, 327, 253, 3733, 673, 50276, 22, 10414, 891, 651, 1804, 253, 4477, 281, 3451, 616, 30404, 247, 2372, 323, 1650, 672, 5015, 670, 4836, 4924, 891, 651, 671, 26542, 17857, 7694, 21066, 9260, 4927, 285, 391, 13678, 12014, 253, 37264, 2557, 908, 275, 436, 2929, 369, 417, 4081, 275, 448, 5353, 73, 610, 6247, 66, 352, 369, 4081, 275, 391, 13678, 50275, 1189, 455, 1014, 2167, 253, 1895, 310, 1774, 891, 1089, 253, 2457, 5661, 9978, 281, 320, 45783, 627, 2168, 403, 1512, 1142, 45783, 26850, 323, 45120, 4715, 891, 651, 2581, 35531, 4266, 432, 18462, 247, 747, 581, 327, 1755, 273, 436, 627, 403, 247, 1643, 7681, 7794, 3733, 673, 3541, 7563, 3966, 891, 513, 417, 4336, 5194, 342, 3103, 651, 2748, 4477, 281, 3662, 1840, 3533, 323, 19843, 7152, 33032, 2520, 2929, 2340, 684, 45120, 4715, 762, 5028, 5333, 285, 29328, 247, 17697, 13727, 2793, 44864, 260, 1321, 1332, 15672, 260, 1321, 13698, 281, 13280, 1711, 3640, 16270, 747, 1491, 285, 39970, 281, 39709, 10625, 275, 1798, 260, 1321, 4648, 48960, 3733, 281, 3451, 253, 5028, 5333, 5661, 1543, 327, 1264, 22791, 15302, 403, 2361, 285, 5469, 50276, 856, 84, 337, 253, 1895, 4758, 273, 45120, 4715, 342, 5028, 5333, 310, 973, 17194, 253, 4477, 5544, 253, 8870, 414, 273, 436, 747, 1895, 4758, 407, 970, 247, 19349, 1566, 50276, 19, 1264, 15216, 403, 2783, 285, 6760, 1690, 253, 1534, 5333, 11134, 5333, 285, 642, 5333, 5661, 1543, 921, 326, 253, 4081, 3082, 562, 32231, 1666, 25379, 275, 1142, 2219, 495, 4583, 253, 2929, 310, 973, 3542, 285, 4518, 10932, 253, 7681, 4278, 403, 3477, 281, 956, 50272, 5040, 50276, 18, 3738, 253, 1895, 4758, 310, 747, 619, 2022, 4468, 310, 253, 3710, 38135, 273, 253, 4081, 260, 1321, 1332, 275, 2508, 2793, 44864, 556, 644, 18171, 5421, 275, 45120, 4715, 1223, 253, 5028, 48960, 4715, 556, 671, 644, 7561, 908, 275, 5028, 15644, 6239, 323, 1142, 1107, 253, 4081, 260, 1321, 1332, 347, 12800, 275, 4677, 374, 3365, 24772, 841, 767, 4295, 50276, 19, 
28913, 2175, 273, 253, 4081, 1332, 403, 5816, 3021, 352, 310, 2834, 281, 2096, 253, 7680, 273, 1016, 4445, 253, 2538, 273, 1027, 10491, 6391, 9552, 3966, 5474, 339, 4239, 751, 281, 638, 1664, 436, 2278, 407, 3981, 326, 891, 717, 417, 271, 6485, 275, 436, 2170, 273, 45120, 4715, 7613, 891, 588, 4575, 619, 2278, 846, 4361, 253, 4477, 2380, 347, 973, 347, 643, 30628, 5701, 15672, 50276, 2520, 789, 29328, 260, 1321, 247, 45120, 4715, 1332, 326, 4575, 84, 281, 5028, 5333, 1309, 1071, 673, 253, 4477, 1750, 326, 5368, 3082, 3451, 3268, 5333, 275, 268, 89, 340, 534, 2789, 253, 10046, 9376, 326, 7239, 50276, 89, 310, 253, 1072, 2439, 10625, 275, 3310, 436, 789, 956, 2720, 789, 407, 1182, 12109, 4072, 285, 3451, 84, 3268, 5333, 275, 268, 89, 50276, 90, 534, 2789, 253, 21076, 9376, 326, 7239, 310, 253, 1072, 2439, 10625, 253, 7680, 273, 436, 789, 2486, 970, 271, 48960, 8103, 3066, 247, 7291, 326, 9437, 281, 12129, 253, 5028, 273, 253, 6311, 6779, 269, 89, 50276, 90, 347, 973, 347, 253, 16253, 273, 271, 2793, 44864, 6391, 432, 534, 281, 3410, 6667, 281, 22318, 253, 7221, 991, 48960, 8103, 50276, 2577, 7350, 403, 347, 3637, 337, 891, 717, 33872, 273, 253, 4275, 1750, 273, 436, 789, 534, 310, 326, 45120, 4715, 1057, 417, 2953, 5028, 5333, 891, 717, 417, 271, 6485, 275, 436, 2170, 533, 281, 619, 3640, 512, 35221, 4715, 789, 326, 6194, 327, 581, 3126, 285, 5223, 84, 281, 1529, 3126, 1364, 45120, 3037, 275, 247, 1027, 3126, 24088, 5987, 39962, 2061, 5375, 1093, 2941, 883, 23568, 5987, 39962, 2061, 5375, 16129, 1235, 2385, 746, 5987, 39962, 2061, 5375, 746, 2313, 3507, 740, 476, 253, 4477, 4385, 327, 1880, 597, 1158, 841, 2987, 326, 1097, 23265, 3037, 285, 5223, 281, 747, 10625, 403, 4623, 604, 594, 651, 352, 1056, 3282, 281, 7277, 281, 731, 25761, 841, 2987, 285, 2571, 275, 5028, 15644, 513, 417, 5467, 326, 7239, 310, 8931, 2439, 10625, 7613, 891, 1928, 751, 436, 310, 247, 2581, 2266, 9376, 374, 253, 7714, 294, 13362, 253, 2926, 273, 4715, 271, 13727, 269, 89, 50276, 90, 12889, 2299, 253, 2426, 275, 534, 352, 1057, 436, 403, 417, 10799, 323, 1650, 849, 513, 253, 4477, 4853, 5028, 4836, 1071, 5028, 2303, 5028, 31429, 7882, 891, 1804, 326, 253, 4477, 4853, 841, 10534, 323, 1650, 259, 15965, 14308, 285, 1918, 11859, 6667, 28462, 275, 2898, 7533, 835, 1896, 495, 672, 253, 4477, 1333, 326, 253, 3268, 310, 6474, 597, 1599, 326, 627, 310, 642, 966, 31561, 875, 10625, 2299, 7882, 275, 247, 3268, 5644, 281, 247, 1180, 273, 9099, 5987, 257, 25842, 2061, 44874, 382, 5063, 382, 2382, 50276, 609, 841, 9099, 3309, 604, 594, 275, 752, 4088, 604, 417, 651, 352, 1056, 3282, 281, 816, 1333, 627, 310, 642, 966, 31561, 875, 10625, 577, 253, 7714, 4428, 690, 20274, 273, 14951, 7242, 534, 1056, 352, 1892, 281, 1239, 323, 1650, 277, 285, 697, 50215, 422, 12955, 1957, 7291, 285, 5028, 253, 2406, 5045, 246, 6125, 1097, 4836, 285, 673, 253, 4477, 17958, 875, 1071, 5028, 285, 2303, 5028, 891, 5936, 326, 253, 4477, 1056, 616, 28939, 5185, 608, 275, 2593, 7609, 253, 4477, 12661, 4715, 247, 6779, 1617, 595, 13727, 275, 268, 5246, 2299, 597, 671, 1333, 326, 436, 369, 2218, 407, 1182, 12109, 4072, 285, 632, 4765, 7613, 436, 310, 417, 247, 7680, 3451, 50276, 249, 6010, 1955, 281, 253, 7364, 22691, 407, 253, 9376, 273, 436, 789, 7239, 253, 1072, 2439, 10625, 253, 32809, 7680, 273, 436, 789, 48960, 2957, 50276, 38835, 44864, 285, 253, 3480, 273, 19843, 285, 12320, 275, 253, 7714, 50276, 74, 5583, 247, 5075, 18235, 1955, 281, 619, 3480, 273, 2793, 275, 45120, 4715, 891, 2550, 2939, 253, 4757, 273, 253, 49602, 285, 253, 8453, 273, 253, 5661, 
1543, 285, 588, 36574, 281, 643, 30628, 187, 187, 4118, 18435, 27, 2520, 2929, 3559, 271, 3909, 45120, 4715, 1332, 835, 627, 778, 320, 247, 5333, 275, 941, 3268, 387, 1071, 673, 253, 2929, 29328, 247, 17697, 13727, 2793, 44864, 260, 1321, 2746, 281, 3451, 253, 2159, 534, 10129, 253, 3268, 273, 14800, 27039, 327, 253, 18012, 436, 310, 1754, 327, 271, 48960, 3733, 6974, 50276, 783, 30628, 1119, 253, 1895, 4758, 4722, 533, 1119, 253, 2746, 281, 320, 14999, 275, 38135, 285, 1895, 15895, 8489, 29190, 24088, 50276, 1844, 4261, 5028, 2654, 1309, 3733, 253, 2488, 8680, 369, 2668, 715, 2395, 533, 253, 30628, 11791, 342, 616, 3236, 6803, 285, 1014, 846, 253, 30080, 22559, 3408, 5293, 273, 253, 30628, 310, 275, 3718, 273, 18738, 253, 2929, 50276, 783, 4477, 403, 15140, 281, 1908, 253, 8680, 432, 253, 30628, 534, 588, 18670, 1361, 281, 3157, 253, 2929, 323, 247, 2852, 19529, 281, 1529, 18767 ]
Below is a review of a research paper from a conference journal. Please write a summary of the review. ### Review: this paper studies variance-adaptive linear bandits and linear mixture mdps this paper improves significantly upon the regret bounds obtained in the previous work by zhang et al 2021 to achieve this they improved the structure of the confidence set in the previous work by removing its epsilon-net construction and they improved the analyses of the previous work by replacing its complicated volumetric-based argument with a simple argument based on their novel elliptical potential count lemma the improvement on the confidence set and the novel elliptical potential count lemma are technically solid strengths 1 the regret bound is significantly improved upon the previous work 2 the analysis is largely simplified compared to the previous work which paves the way for the novel confidence set to be applied to broader theoretical applications 3 the novel elliptical potential count lemma would be of broader interest to the bandit learning community 4 the bound for linear mixture mdp is near-optimal up to log factors weakness 1 the bound for linear bandits is a factor of sqrt(d) away from being tight 2 the confidence set is still too complicated to be applied in practice and optimizing over the confidence set cannot be done in polynomial time 1 the proof roadmap for both linear bandits and linear mixture mdp largely follows the previous work which might limit the novelty of this paper however the confidence set might not be able to become a standalone paper so i think it is inevitable to have this limitation docsepthis paper revisits the problem of linear bandits and linear mixture mdps from zhang et al 2021 the authors follow their notion of variance-adaptive regret bound and provide a novel analysis to improve such bounds in particular for the linear bandits setting the current work improves zhang et al's result by a factor of O(d^3) in the leading regret term for the linear mixture mdps the improvement is a factor of O(d^3.5) the key of the analysis is a novel peeling-based regret analysis that leverages the elliptical potential count lemma which could be of independent interest to other related problems strength 1 the authors made solid theoretical contributions to the literature the current work significantly improves the previous work's regret bounds and the techniques are applicable to both linear bandits and linear mixture mdps from my perspective the peeling techniques and the elliptical potential count lemma can be applied to a wider scope beyond the aforementioned linear bandits and linear mixture mdps 2 the technical analysis is sound and the writing is easy to follow weakness 1 my main concern is about the practical usage of the current variance-adaptive algorithm the confidence interval used in eq 3 seems quite complicated i cannot see the clear landscape of this confidence interval i wonder how to find the action xk given such a conceptual interval it could be better if the authors could elaborate more on the landscape of the optimization problem using eq 3 as the feasible set of theta for example the authors could explain under what conditions or in what special cases this interval could be used efficiently another way is to show some plausible heuristics to calculate line 4 of voful2 like optimizing theta and xk alternately 2 the current work does not include any empirical evaluations for their algorithm and the contribution is purely theoretical i guess it is due to the same
reason as mentioned above that line 4 of voful2 does not have efficient solutions the authors mentioned the limitations of computational issues but do not provide explicit solutions see my comments in the strengths and weaknesses part and there is no potential negative societal impact of their work docsepthis work focused on the linear bandit and linear mixture mdps the author proposes a novel variance-dependent algorithm voful2 for the linear bandit which improved the regret by a factor of d^3 for linear mixture mdp the author also proposes the varlin2 algorithm with a horizon-free regret guarantee which improved the previous results by a factor of d^3.5 strength 1 the voful2 algorithm improved previous results in the unknown-variance linear bandit and reduced the gap to only sqrt(d) 2 the varlin2 algorithm obtains a horizon-free regret guarantee for linear mixture mdp and the result is near-optimal in the dimension d weakness 1 the voful2 and varlin2 algorithms are not novel they partly follow the voful and varlin2 algorithms in zhang et al 2021 zhang z, yang j, ji x, et al variance-aware confidence set variance-dependent bound for linear bandits and horizon-free bound for linear mixture mdp 2 voful2 and varlin2 also inherited the disadvantage of being computationally inefficient in addition without the epsilon-net structure the voful2 algorithm needs to consider the intersection of an infinite family of confidence sets which is much more complicated than the original voful algorithm 3 the lemma 4 elliptical potential count is not new this lemma and also the corresponding peeling technique in linear mixture mdps appear in the proofs of lemmas b3-b5 of he et al 2021 previously he j, zhou d, gu q logarithmic regret for reinforcement learning with linear function approximation this paper provides theoretical guarantees for learning linear bandits and linear mixture mdps there is no negative societal impact ### Summary:
this paper gives the first minimax-optimal up to log factors and horizon-free regret bound for linear mixture mdps and an improved variance-dependent bound for linear bandits furthermore the paper developed a new peeling-based analysis that can be useful for other problems these contributions make this paper a strong paper in the theoretical rl community the ac thus recommends acceptance
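The elliptical potential count lemma highlighted in the reviews and the summary above can be illustrated numerically. The following self-contained sketch is my own illustration, not code from the paper; the threshold, the regularizer lambda, and the synthetic data are arbitrary choices. It counts how many rounds t satisfy x_t^T A_{t-1}^{-1} x_t > threshold along a sequence of feature vectors with norm at most 1, where A_t = lambda*I + sum_{s<=t} x_s x_s^T; the lemma asserts that this count stays small (on the order of d times log factors), which is what the peeling-based regret analysis exploits.

```python
# Illustrative only: a numerical check of the "elliptical potential count" idea.
# All constants (lambda, threshold, T, d, synthetic data) are arbitrary demo choices.
import numpy as np

def elliptical_potential_count(xs, lam=0.01, threshold=1.0):
    """Count rounds t with x_t^T A_{t-1}^{-1} x_t > threshold, A_t = lam*I + sum x_s x_s^T."""
    d = xs.shape[1]
    A = lam * np.eye(d)
    count = 0
    for x in xs:
        if x @ np.linalg.solve(A, x) > threshold:
            count += 1
        A += np.outer(x, x)
    return count

rng = np.random.default_rng(0)
T, d = 5000, 8
xs = rng.normal(size=(T, d))
xs /= np.maximum(np.linalg.norm(xs, axis=1, keepdims=True), 1.0)  # enforce ||x_t|| <= 1
print(elliptical_potential_count(xs), "of", T, "rounds exceeded the threshold")
```

With these settings the reported count is far smaller than T, consistent with the bound described in the review.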
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 2175, 11041, 26672, 422, 4872, 3961, 953, 285, 4872, 7802, 31934, 793, 436, 2929, 19132, 3012, 2220, 14938, 14493, 2797, 275, 253, 2045, 789, 407, 1182, 12109, 1162, 355, 43425, 281, 5115, 436, 597, 5520, 253, 2605, 273, 253, 7162, 873, 275, 253, 2045, 789, 407, 11922, 253, 299, 4277, 3024, 5140, 275, 253, 2045, 789, 285, 597, 5520, 253, 6260, 273, 253, 2045, 789, 407, 15706, 253, 9542, 1936, 45558, 3169, 4154, 275, 253, 2045, 789, 407, 247, 2969, 4154, 1754, 327, 616, 4460, 44613, 2442, 1385, 18057, 253, 7756, 327, 253, 7162, 873, 285, 253, 4460, 44613, 2442, 1385, 18057, 403, 22335, 4891, 50276, 296, 3755, 20556, 337, 253, 14938, 3033, 310, 3012, 5520, 2220, 253, 2045, 789, 374, 253, 6260, 310, 8127, 21010, 2220, 253, 2045, 789, 534, 268, 3465, 253, 1039, 323, 253, 4460, 2410, 1435, 873, 281, 320, 3732, 281, 16055, 10527, 4893, 50276, 20, 253, 4460, 44613, 2442, 1385, 18057, 651, 320, 273, 1795, 472, 254, 1600, 281, 253, 3961, 262, 4715, 3114, 50276, 20, 253, 3033, 323, 4872, 7802, 278, 12132, 310, 2822, 29776, 598, 281, 2412, 2616, 50276, 20881, 1255, 337, 253, 3033, 323, 4872, 3961, 953, 310, 8084, 277, 1977, 432, 1146, 6863, 374, 253, 7162, 873, 310, 1335, 1512, 9542, 281, 320, 3732, 275, 3946, 285, 39793, 275, 253, 7162, 873, 2550, 320, 2218, 275, 14189, 673, 50276, 18, 253, 4737, 3971, 4251, 323, 1097, 4872, 3961, 953, 285, 4872, 7802, 278, 12132, 8127, 3637, 2220, 253, 2045, 789, 534, 1537, 2701, 253, 38135, 273, 436, 2929, 2299, 253, 7162, 873, 50276, 22732, 417, 320, 2104, 281, 2489, 247, 40468, 2929, 594, 891, 1158, 352, 19455, 281, 452, 436, 12291, 5474, 33032, 2520, 2929, 27694, 953, 253, 1895, 323, 4872, 3961, 953, 285, 4872, 7802, 31934, 793, 432, 1182, 12109, 1162, 355, 43425, 253, 4477, 956, 616, 10732, 273, 11041, 26672, 422, 14938, 3033, 285, 2085, 4460, 1783, 281, 3157, 824, 3033, 275, 1798, 323, 253, 4872, 3961, 953, 4758, 253, 1655, 789, 19132, 1182, 12109, 1162, 14350, 906, 407, 247, 2803, 273, 7687, 20, 275, 253, 4283, 14938, 84, 323, 253, 4872, 7802, 31934, 793, 253, 7756, 310, 247, 2803, 273, 7687, 1671, 253, 2234, 273, 253, 1783, 310, 247, 4460, 759, 8855, 3169, 14938, 1783, 326, 19732, 1131, 253, 44613, 2442, 1385, 18057, 534, 812, 320, 273, 3907, 1600, 281, 643, 2905, 3237, 4757, 337, 253, 4477, 1160, 4891, 10527, 9021, 281, 253, 6239, 253, 1655, 789, 3012, 19132, 2045, 2987, 14938, 14493, 285, 253, 5609, 403, 7763, 281, 1097, 4872, 3961, 953, 285, 4872, 7802, 31934, 793, 432, 619, 8668, 253, 759, 8855, 5609, 285, 253, 44613, 2442, 1385, 18057, 476, 320, 3732, 281, 247, 14200, 7990, 4457, 253, 18979, 4872, 3961, 953, 285, 4872, 7802, 31934, 793, 50276, 19, 253, 7681, 1783, 310, 3590, 285, 253, 4028, 310, 3477, 281, 956, 50276, 20881, 1255, 50276, 18, 619, 2022, 4468, 310, 670, 253, 8542, 10393, 273, 253, 1655, 11041, 26672, 422, 5933, 253, 7162, 7726, 908, 275, 16186, 495, 3133, 3240, 9542, 891, 2550, 923, 253, 2590, 13016, 273, 436, 7162, 7726, 891, 4282, 849, 281, 1089, 253, 2250, 1269, 76, 1677, 824, 247, 20178, 7726, 352, 812, 320, 1805, 604, 253, 4477, 812, 21184, 625, 327, 253, 13016, 273, 253, 13757, 1895, 970, 16186, 495, 347, 253, 17887, 873, 273, 39116, 323, 1650, 253, 2488, 812, 5513, 762, 752, 2515, 390, 275, 752, 2714, 2219, 436, 7726, 812, 320, 14556, 908, 1529, 1039, 310, 281, 921, 690, 21541, 344, 321, 3397, 281, 10173, 1386, 577, 273, 253, 3273, 1020, 19, 751, 39793, 39116, 285, 1269, 76, 
3960, 1523, 50276, 19, 253, 1655, 789, 1057, 417, 2486, 667, 16774, 27163, 323, 616, 5933, 285, 253, 7680, 310, 15846, 10527, 891, 5476, 352, 310, 1955, 281, 253, 1072, 1921, 347, 5393, 1840, 326, 1386, 577, 273, 3273, 1020, 19, 1057, 417, 452, 5919, 5482, 50276, 783, 4477, 5393, 253, 7364, 273, 15180, 3374, 533, 1057, 417, 2085, 6843, 5482, 923, 619, 5701, 275, 253, 20544, 285, 32213, 629, 285, 627, 310, 642, 2442, 4016, 38058, 3486, 273, 616, 789, 5474, 33032, 2520, 789, 7106, 327, 253, 4872, 3961, 262, 285, 4872, 7802, 31934, 793, 253, 2488, 29328, 247, 4460, 945, 757, 758, 2662, 5933, 3273, 1020, 19, 323, 253, 4872, 3961, 262, 534, 5520, 253, 14938, 342, 247, 2803, 273, 277, 20, 323, 4872, 7802, 278, 12132, 253, 2488, 671, 29328, 253, 945, 3642, 374, 5933, 342, 247, 16892, 4924, 14938, 12215, 534, 5520, 253, 2045, 1543, 342, 247, 2803, 273, 277, 1671, 4757, 337, 253, 3273, 1020, 19, 5933, 5520, 2045, 1543, 275, 7202, 87, 14417, 4872, 3961, 262, 285, 3777, 253, 8037, 281, 760, 8084, 69, 374, 253, 945, 3642, 19, 5933, 31326, 247, 16892, 4924, 14938, 12215, 323, 4872, 7802, 278, 12132, 285, 253, 906, 310, 2822, 29776, 342, 253, 7877, 277, 50276, 20881, 1255, 337, 253, 3273, 1020, 19, 285, 945, 3642, 19, 11333, 403, 417, 4460, 597, 403, 629, 1563, 253, 3273, 1020, 285, 945, 3642, 374, 11333, 275, 1182, 12109, 1162, 355, 43425, 50275, 91, 12109, 1182, 30966, 480, 480, 74, 1269, 1162, 355, 11041, 13823, 7162, 873, 945, 757, 758, 2662, 3033, 323, 4872, 3961, 953, 285, 16892, 4924, 3033, 323, 4872, 7802, 278, 12132, 50276, 19, 3273, 1020, 19, 285, 945, 3642, 19, 671, 20265, 253, 23797, 273, 12672, 31334, 314, 275, 1635, 1293, 253, 299, 4277, 3024, 2605, 253, 3273, 1020, 19, 5933, 3198, 281, 1908, 253, 15171, 273, 253, 11968, 7162, 873, 534, 310, 1199, 625, 9542, 685, 253, 3236, 3273, 1020, 5933, 50276, 20, 253, 18057, 577, 44613, 2442, 1385, 310, 417, 747, 436, 18057, 285, 671, 253, 3969, 759, 8855, 5853, 275, 4872, 7802, 31934, 793, 4620, 275, 253, 4737, 273, 458, 44661, 270, 20, 67, 22, 344, 1045, 355, 43425, 3786, 50276, 248, 480, 1182, 14451, 277, 1149, 2805, 32643, 14938, 323, 35221, 4715, 342, 4872, 1159, 11193, 50276, 2520, 2929, 3400, 10527, 23632, 323, 4715, 4872, 3961, 262, 285, 4872, 7802, 278, 12132, 627, 310, 642, 4016, 38058, 3486, 2490, 187, 4118, 18435, 27, 2520, 2929, 4245, 253, 806, 7221, 991, 8654, 598, 281, 2412, 2616, 285, 16892, 4924, 323, 4872, 7802, 278, 12132, 285, 5520, 945, 757, 758, 2662, 3033, 323, 4872, 3961, 953, 33810, 253, 2929, 3715, 247, 747, 759, 8855, 3169, 1783, 326, 476, 320, 4217, 323, 643, 3237, 841, 9021, 1056, 436, 2929, 247, 2266, 2929, 275, 253, 10527, 391, 77, 3114, 253, 913, 3021, 32636, 14924, 209 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 2175, 11041, 26672, 422, 4872, 3961, 953, 285, 4872, 7802, 31934, 793, 436, 2929, 19132, 3012, 2220, 14938, 14493, 2797, 275, 253, 2045, 789, 407, 1182, 12109, 1162, 355, 43425, 281, 5115, 436, 597, 5520, 253, 2605, 273, 253, 7162, 873, 275, 253, 2045, 789, 407, 11922, 253, 299, 4277, 3024, 5140, 275, 253, 2045, 789, 285, 597, 5520, 253, 6260, 273, 253, 2045, 789, 407, 15706, 253, 9542, 1936, 45558, 3169, 4154, 275, 253, 2045, 789, 407, 247, 2969, 4154, 1754, 327, 616, 4460, 44613, 2442, 1385, 18057, 253, 7756, 327, 253, 7162, 873, 285, 253, 4460, 44613, 2442, 1385, 18057, 403, 22335, 4891, 50276, 296, 3755, 20556, 337, 253, 14938, 3033, 310, 3012, 5520, 2220, 253, 2045, 789, 374, 253, 6260, 310, 8127, 21010, 2220, 253, 2045, 789, 534, 268, 3465, 253, 1039, 323, 253, 4460, 2410, 1435, 873, 281, 320, 3732, 281, 16055, 10527, 4893, 50276, 20, 253, 4460, 44613, 2442, 1385, 18057, 651, 320, 273, 1795, 472, 254, 1600, 281, 253, 3961, 262, 4715, 3114, 50276, 20, 253, 3033, 323, 4872, 7802, 278, 12132, 310, 2822, 29776, 598, 281, 2412, 2616, 50276, 20881, 1255, 337, 253, 3033, 323, 4872, 3961, 953, 310, 8084, 277, 1977, 432, 1146, 6863, 374, 253, 7162, 873, 310, 1335, 1512, 9542, 281, 320, 3732, 275, 3946, 285, 39793, 275, 253, 7162, 873, 2550, 320, 2218, 275, 14189, 673, 50276, 18, 253, 4737, 3971, 4251, 323, 1097, 4872, 3961, 953, 285, 4872, 7802, 278, 12132, 8127, 3637, 2220, 253, 2045, 789, 534, 1537, 2701, 253, 38135, 273, 436, 2929, 2299, 253, 7162, 873, 50276, 22732, 417, 320, 2104, 281, 2489, 247, 40468, 2929, 594, 891, 1158, 352, 19455, 281, 452, 436, 12291, 5474, 33032, 2520, 2929, 27694, 953, 253, 1895, 323, 4872, 3961, 953, 285, 4872, 7802, 31934, 793, 432, 1182, 12109, 1162, 355, 43425, 253, 4477, 956, 616, 10732, 273, 11041, 26672, 422, 14938, 3033, 285, 2085, 4460, 1783, 281, 3157, 824, 3033, 275, 1798, 323, 253, 4872, 3961, 953, 4758, 253, 1655, 789, 19132, 1182, 12109, 1162, 14350, 906, 407, 247, 2803, 273, 7687, 20, 275, 253, 4283, 14938, 84, 323, 253, 4872, 7802, 31934, 793, 253, 7756, 310, 247, 2803, 273, 7687, 1671, 253, 2234, 273, 253, 1783, 310, 247, 4460, 759, 8855, 3169, 14938, 1783, 326, 19732, 1131, 253, 44613, 2442, 1385, 18057, 534, 812, 320, 273, 3907, 1600, 281, 643, 2905, 3237, 4757, 337, 253, 4477, 1160, 4891, 10527, 9021, 281, 253, 6239, 253, 1655, 789, 3012, 19132, 2045, 2987, 14938, 14493, 285, 253, 5609, 403, 7763, 281, 1097, 4872, 3961, 953, 285, 4872, 7802, 31934, 793, 432, 619, 8668, 253, 759, 8855, 5609, 285, 253, 44613, 2442, 1385, 18057, 476, 320, 3732, 281, 247, 14200, 7990, 4457, 253, 18979, 4872, 3961, 953, 285, 4872, 7802, 31934, 793, 50276, 19, 253, 7681, 1783, 310, 3590, 285, 253, 4028, 310, 3477, 281, 956, 50276, 20881, 1255, 50276, 18, 619, 2022, 4468, 310, 670, 253, 8542, 10393, 273, 253, 1655, 11041, 26672, 422, 5933, 253, 7162, 7726, 908, 275, 16186, 495, 3133, 3240, 9542, 891, 2550, 923, 253, 2590, 13016, 273, 436, 7162, 7726, 891, 4282, 849, 281, 1089, 253, 2250, 1269, 76, 1677, 824, 247, 20178, 7726, 352, 812, 320, 1805, 604, 253, 4477, 812, 21184, 625, 327, 253, 13016, 273, 253, 13757, 1895, 970, 16186, 495, 347, 253, 17887, 873, 273, 39116, 323, 1650, 253, 2488, 812, 5513, 762, 752, 2515, 390, 275, 752, 2714, 2219, 436, 7726, 812, 320, 14556, 908, 1529, 1039, 310, 281, 921, 690, 21541, 344, 321, 3397, 281, 10173, 1386, 577, 273, 253, 3273, 1020, 19, 751, 39793, 39116, 285, 1269, 76, 
3960, 1523, 50276, 19, 253, 1655, 789, 1057, 417, 2486, 667, 16774, 27163, 323, 616, 5933, 285, 253, 7680, 310, 15846, 10527, 891, 5476, 352, 310, 1955, 281, 253, 1072, 1921, 347, 5393, 1840, 326, 1386, 577, 273, 3273, 1020, 19, 1057, 417, 452, 5919, 5482, 50276, 783, 4477, 5393, 253, 7364, 273, 15180, 3374, 533, 1057, 417, 2085, 6843, 5482, 923, 619, 5701, 275, 253, 20544, 285, 32213, 629, 285, 627, 310, 642, 2442, 4016, 38058, 3486, 273, 616, 789, 5474, 33032, 2520, 789, 7106, 327, 253, 4872, 3961, 262, 285, 4872, 7802, 31934, 793, 253, 2488, 29328, 247, 4460, 945, 757, 758, 2662, 5933, 3273, 1020, 19, 323, 253, 4872, 3961, 262, 534, 5520, 253, 14938, 342, 247, 2803, 273, 277, 20, 323, 4872, 7802, 278, 12132, 253, 2488, 671, 29328, 253, 945, 3642, 374, 5933, 342, 247, 16892, 4924, 14938, 12215, 534, 5520, 253, 2045, 1543, 342, 247, 2803, 273, 277, 1671, 4757, 337, 253, 3273, 1020, 19, 5933, 5520, 2045, 1543, 275, 7202, 87, 14417, 4872, 3961, 262, 285, 3777, 253, 8037, 281, 760, 8084, 69, 374, 253, 945, 3642, 19, 5933, 31326, 247, 16892, 4924, 14938, 12215, 323, 4872, 7802, 278, 12132, 285, 253, 906, 310, 2822, 29776, 342, 253, 7877, 277, 50276, 20881, 1255, 337, 253, 3273, 1020, 19, 285, 945, 3642, 19, 11333, 403, 417, 4460, 597, 403, 629, 1563, 253, 3273, 1020, 285, 945, 3642, 374, 11333, 275, 1182, 12109, 1162, 355, 43425, 50275, 91, 12109, 1182, 30966, 480, 480, 74, 1269, 1162, 355, 11041, 13823, 7162, 873, 945, 757, 758, 2662, 3033, 323, 4872, 3961, 953, 285, 16892, 4924, 3033, 323, 4872, 7802, 278, 12132, 50276, 19, 3273, 1020, 19, 285, 945, 3642, 19, 671, 20265, 253, 23797, 273, 12672, 31334, 314, 275, 1635, 1293, 253, 299, 4277, 3024, 2605, 253, 3273, 1020, 19, 5933, 3198, 281, 1908, 253, 15171, 273, 253, 11968, 7162, 873, 534, 310, 1199, 625, 9542, 685, 253, 3236, 3273, 1020, 5933, 50276, 20, 253, 18057, 577, 44613, 2442, 1385, 310, 417, 747, 436, 18057, 285, 671, 253, 3969, 759, 8855, 5853, 275, 4872, 7802, 31934, 793, 4620, 275, 253, 4737, 273, 458, 44661, 270, 20, 67, 22, 344, 1045, 355, 43425, 3786, 50276, 248, 480, 1182, 14451, 277, 1149, 2805, 32643, 14938, 323, 35221, 4715, 342, 4872, 1159, 11193, 50276, 2520, 2929, 3400, 10527, 23632, 323, 4715, 4872, 3961, 262, 285, 4872, 7802, 278, 12132, 627, 310, 642, 4016, 38058, 3486, 2490, 187, 4118, 18435, 27, 2520, 2929, 4245, 253, 806, 7221, 991, 8654, 598, 281, 2412, 2616, 285, 16892, 4924, 323, 4872, 7802, 278, 12132, 285, 5520, 945, 757, 758, 2662, 3033, 323, 4872, 3961, 953, 33810, 253, 2929, 3715, 247, 747, 759, 8855, 3169, 1783, 326, 476, 320, 4217, 323, 643, 3237, 841, 9021, 1056, 436, 2929, 247, 2266, 2929, 275, 253, 10527, 391, 77, 3114, 253, 913, 3021, 32636, 14924, 209 ]
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: this paper proposes a transformer architectural variant the motivation of this variant is that it contains a sliding language modeling objective for sentence scoring that enables it to look at all the tokens in a sentence using bidirectional context in a single pass this solves the issue of using standard causal language modeling or masked language models which respectively have limitations of using only unidirectional context and requiring multiple forward passes strengths the structure of the paper is wellmotivated it is clear what the limitations of causal lm and mlm are for sentence scoring and the paper proposes an architecture that addresses those two issues the writing of the paper is quite clear there is a natural progression from motivation to method to results and overall the paper is pretty understandable the significance of the paper is very high in principle as causal lm and mlm are the standard objectives that are very commonly used in nlp a new paradigm that outperforms these two types of objectives would be very impactful the fact that slm needs 3 times the compute of similar sized models is a limitation but the authors explicitly explored this section 5 weaknesses the promise of a better sentence scorer is potentially very impactful the authors chose to evaluate sentence reranking on multiple generated candidates for nmt and for asr i have several concerns about these experiments that made them not particularly convincing for me first the experiments are in my view a bit thin for instance the gain from slm over bert is not very large and there is still a large gap between it and the oracle furthermore while several baselines were provided there have been many architectures recently proposed and probably more comparisons are needed to fully justify the method as some examples it could have been good to compare with t5 bart xlnet etc i am also not sure about the setup for the ablations why were the ablations not performed on all the datasets i think this could have been addressed by putting a final row in table 2 with slm with onethird of the compute so that the compute is matched it is fine if it does not win in all settings under the same compute but in this case it would be good to be transparent about how much compute is needed to get the same performance and whether there are performance gains in multiple different computematched settings finally i am not sure why nmt and asr were chosen as the experimental settings in the first place it could have been stronger if the paper had chosen a language understanding benchmark like glue which could be a better fit for pretrained language models i have voted for rejection primarily because the experiments are not particularly convincing but if enough of the questionsconcerns that i had are addressed in the rebuttal i will be happy to change my score the authors were upfront about the additional computational cost of their method though additional experiments would be needed for the empirical gains in computematched settings to be convincing for me the societal impact part seems fine docsepthis work presents a transformerbased model for sentence scoring which aims to overcome the limitations of gptstyle causal language model pretraining only the unidirectional context is used and bertstyle masked language model pretraining multiple forward passes are required when scores for each token are needed the authors propose an
xlnetstyle multistream selfattention which employs a forward stream to capture the ltr context a backward stream for the rtl context and a query stream which captures information from both this method is experimentally validated on two tasks neural machine translation and reranking for asr the proposed method generally outperforms clm and mlm approaches the authors also perform further experiments showing that the proposed method is still able to outperform clm when the size of the models is adjusted such that the amount of inferencetime computation is equal finally they also attempt to make mlm more efficient by masking multiple tokens at a time thereby reducing the amount of required computation but the proposed method still comes out ahead the main idea seems like a fairly straightforward variant of the xlnet architecture not too dissimilar from other past proposals to modify its base architecture by changing the nature of the streams aside from originality overall the paper feels solid both the architecture and the experiments are clearly described the tasks chosen by the authors for the evaluation are realistic and at least for the nmt experiments the baselines are sensible and their performance is in the right ballpark given that one of the selling points of the proposed method is computational efficiency the experiments of sections 51 and 52 are useful the results are promising showing that under comparable amounts of computation the authors proposal still comes ahead a couple of minor points it would be useful to have dataset and split sizes for the experiments of section 4 directly in the paper iwslt14 might be a small dataset but i dont think german spanish italian etc should be described as lowresource i do not see any potential negative societal impacts of this work limitations are sufficiently addressed docsepthis work proposes a modified transformer encoder the transcormer which gives perposition log pseudoprobabilities in a single pass unlike masked lms while still capturing tokentoken dependencies at all layers these probabilities sum to give a score that can be used to rescore nmt and asr hypotheses this works by having the keyvalue representations be concatenated from a forward inclusive and backward inclusive attention stream before taking the outer product with the query which never sees the content of its own position experiments on iwslt wmt and librispeech suggest this scheme outperforms mlm and clm for rescoring postrebuttal i raise soundness to 44 and my score from 410 to 610 the work is thorough and introduces an architectural innovation to more richly model p(y_s | y_{\backslash s}) in one pass unlike the paper originally claimed it is not the first such model tta and electric also unsupervised sentence scoring remains niche mlm scoring is more for inspection of preexisting models and for downstream tasks most works use discriminative ranking however the authors have done many experiments it performs better than mlms and previous models and may have applications elsewhere analogous to xlnet also published at neurips there are still many grammar errors please work with a copyeditor or native english speaker on any cameraready the authors succeed in proposing a scheme that models token probabilities conditioned on all other tokens eq 1 in a single pass while preserving deep bidirectionality the method appears sound and while the mechanism is complicated figures 1 and 2 do well to explain it though figure 2b should have labels saying which axis is layer l1 and which is layer l
however i have concerns re the clarity and significance of this work firstly there are two uncited works that have proposed onepass solutions to tokenwise probabilities tta j shin y lee s yoon k jung fast and accurate deep bidirectional language representations for unsupervised learning acl 2020 electric k clark mt luong q le c manning pretraining transformers as energybased cloze models emnlp 2020 in particular the first works proposed tta has many similarities a query initialized with position embeddings only a query that is updated through to the end and is selectively blocked on the self position while having some benefits no 3x in cost and disadvantages the keys values do not transform over layers the authors should also acknowledge 18 in section 53 where an almost identical analysis is performed even down to the choice of length and plot figure 3 here is largely 18s figures 34 while the authors compare on some tasks nmt asr as prior work it would be good to compare on the same datasets or even nbest lists for example 18 20 and the above two works compare on a shared set of nbest librispeech lists while this work creates new ones this is mitigated by the authors releasing model code thank you but would help convince readers especially as slm models are not offtheshelf but trained from scratch and hopefully released it is also unclear eg what l20s quality of a sentence or what l2324s precise sentence scoring mean only in section 21 page 3 do we get two definitions via clm which by the chain rule gives the loglikelihood score log p(y_1 ... y_{|y|}) via mlm which gives the pseudologlikelihood score sum_{i=1}^{|y|} log p(y_i | y_{\backslash i}) slm models the latter it would be better to make this clear early and then mention howwhy these quantities are empirically or even theoretically good quantities to rescore with so before in 2638 with the second being better which is why slm models it discriminative rescoring should also be discussedcontrasted early or at least in the main text not the appendix i appreciate some of the analyses which complement past work in particular a31s discussion comparing sentence vs fixedlength pretraining and 53 showing slm and mlms similar scores section 51 and 52 are unsatisfying though as increasing the number of masked tokens would be expected to be poor given the mismatch with pretraining in bert masking 15 minor notes the word sliding evokes sliding windows ie only a contiguous fixed length of inputs are considered at any time giving the wrong impression l195198 the structure of bilms should be explained more l163 should clarify that pi refer to position embeddings writing should be proofread eg l18 and elsewhere language modelings language modeling approaches or language modeling schemes or even language models l85 in a chainstyle rule via the chain rule figure 2 reuse reuses can not cannot table 5 appendix table 5 testclear testclean l233 236 appendix table 5 librspeech librispeech while slm gives tokenwise probabilities out of the box it does require pretraining from scratch and is rather specialized whereas mlm and clm have other uses hopefully more uses can be explored to justify this cost see works above for examples like acceptability blimp or text similarity sts ### Summary:
this paper proposes a transformer architectural variant the motivation of this variant is that it contains a sliding language modeling objective for sentence scoring that enables it to look at all the tokens in a sentence using bidirectional context in a single pass this solves the issue of using standard causal language modeling or masked language models which respectively have limitations of using only unidirectional context and requiring multiple forward passes the empirical gains in terms of quality are modest but the speedups are quite impressive it would have been nice to see evaluations on a few more tasks like superglue it is a bit unclear why such results are not presented but on balance the paper is probably still above the cutoff
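For readers skimming the row above, the two sentence-scoring quantities that these reviews contrast can be written out explicitly; this is an editorial restatement in standard notation rather than text from the dataset itself, with y_i denoting the i-th token of a sentence y.

```latex
% Causal LM score: exact log-likelihood via the chain rule (one left-to-right pass)
\log p(y) = \sum_{i=1}^{|y|} \log p\left(y_i \mid y_{<i}\right)

% Masked LM score: pseudo-log-likelihood, each token conditioned on all the others
\mathrm{PLL}(y) = \sum_{i=1}^{|y|} \log p\left(y_i \mid y_{\backslash i}\right)
```

A standard masked LM needs one forward pass per position to evaluate the second sum, which is the |y|-passes cost that the single-pass model discussed in the reviews is designed to avoid.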
[ input_ids, attention_mask, and labels for the example above omitted — long token-id and all-ones integer arrays with no additional readable content ]
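As a rough, self-contained sketch of how such pseudo-log-likelihood scores are typically used to rerank NMT or ASR n-best lists (the evaluation setup the reviews describe), the snippet below masks one position at a time and sums per-position log-probabilities; the `log_prob_fn` callable and the uniform dummy scorer are illustrative placeholders, not the reviewed paper's actual model or code.

```python
import math
from typing import Callable, List, Sequence

MASK = "<mask>"

def pseudo_log_likelihood(tokens: Sequence[str],
                          log_prob_fn: Callable[[List[str], int], float]) -> float:
    """Sum of log p(token_i | all other tokens), masking one position per pass.

    The loop is what makes masked-LM scoring cost one forward pass per token,
    the inefficiency that single-pass bidirectional scorers try to remove.
    """
    total = 0.0
    for i in range(len(tokens)):
        masked = list(tokens)
        masked[i] = MASK
        total += log_prob_fn(masked, i)
    return total

def rerank(nbest: List[List[str]], log_prob_fn) -> List[List[str]]:
    """Sort candidate hypotheses (e.g. from an NMT or ASR system) by score, best first."""
    return sorted(nbest, key=lambda hyp: pseudo_log_likelihood(hyp, log_prob_fn), reverse=True)

if __name__ == "__main__":
    # Uniform dummy scorer so the sketch runs end to end; a real setup would
    # query a trained masked language model here instead.
    dummy = lambda masked, i: -math.log(30000.0)
    hypotheses = [["the", "cat", "sat", "on", "the", "mat"],
                  ["the", "cat", "sat", "on", "the", "matt"]]
    print(rerank(hypotheses, dummy))
```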
Below is given review of a research paper from conference journal. Please write a summary of the review. ### Review: the paper aims to learn an autoencoder that can be used to effectively encode the known attributes generative factors and this allows easy and controlled manipulation of the images while producing realistic images to achieve this ordinarily the encoder produces latent code with two components y and z where y are clamped to known attributes using supervised loss while z is unconstrained and mainly useful for good reconstruction but this setup fails when z is sufficiently large as the decoder can learn to ignore y altogether smaller sized z leads to poor reconstruction to overcome this issue the authors propose to employ a student teacher training paradigm the teacher is trained such that the encoder only produces y and the decoder only consumes y this ensures good disentanglement but poor reconstruction subsequently a student autoencoder is learned which has a much larger latent code and produces both y and z the y component is mapped to the teacher encoders y component using jacobian regularization positives the results of image manipulation using known attributes are quite impressive the authors propose modifications to the jacobian regularization as simple reconstruction losses for efficient training the approach avoids adversarial training and thus is easier to train negatives unsupervised disentanglement results are only shown for mnist i am not convinced similar results for unsupervised disentanglement can be obtained on more complex datasets authors should include some results on this aspect or reduce the emphasis on unsupervised disentanglement also when studying this quantitative evaluation for disentanglement such as in betavae will be nice to have typos page 3 tobtain obtain page 5 conditionning conditioning docsepsummary the paper proposes a method to tackle the disentanglementreconstruction tradeoff problem in many disentangling approaches this is achieved by first training the teacher autoencoder unsupervised or supervised that learns to disentangle the factors of variation at the cost of poor reconstruction and then distills these learned representations into a student model with extra latent dimensions where these extra latents can be used to improve the reconstructions of the student autoencoder compared to the teacher autoencoder the distillation of the learned representation is encouraged via a novel jacobian loss term that encourages the change in reconstructions of the teacher and student to be similar when the latent representation changes there is one experiment for progressive unsupervised disentangling disentangling factor by factor on mnist data and one experiment for semisupervised disentangling on celebahq pros i think the idea of progressively capturing factors of variation one by one is neat and this appears to be one of the first successful attempts at this problem the distillation appears to work well on the mnist data and does indeed decrease the reconstruction loss of the student compared to the teacher the qualitative results on celebahq look strong especially apparent in the video with the clear advantage over fader networks being that the proposed model is a single model that can manipulate the 40 different attributes whereas fader nets can only deal with at most 3 attributes per model cons there are not enough quantitative results supporting the claim that the model is effective at both disentangling and reconstruction the degree of
disentanglement in the representations is only shown qualitatively via latent interpolation and only for a single model such qualitative results are generally prone to cherrypicking and it is difficult to reliably compare different disentangling methods in this manner this calls for quantitative measures of disentanglement had you used a dataset where you know the ground truth factors of variation eg dsprites2d shapes data for the unsupervised disentangling method then the level of disentanglement in the learned representations could be quantified and thus your method could be compared against unsupervised disentangling baselines for the semisupervised disentanglement example on celeba you could for example quantify how well the encoder predicts the different attributes because there is ground truth here eg report rmse of the yis on a held out test set with ground truth a quantitative comparison with fader networks in this manner appears necessary the qualitative comparison on a single face in figure 5 is nowhere near sufficient there is quantitative evidence that the reconstruction loss decreases when training the student but here its not clear whether this quantitative difference makes a qualitative difference in the reconstructions getting higher fidelity images is one of the motivations behind improving reconstructions so it would be informative to compare the reconstructions of the teacher and the student on the same image in the celeba experiments the benefit of student training is not visible in the results in figure 5 you already show that the teacher model gives decent reconstructions yet you dont show the reconstruction for the student model quantitatively you show that it improves in figure 3b but again it is worth checking if it makes a difference visually also its not clear whether figure 4 are results from the student model or the teacher model im guessing that they are from the student model these quantitative results could form the basis of doing ablation studies for each of the different losses in the additive loss for both unsupervised semisupervised tasks because there are many components in the loss with a hyperparameter for each it would be helpful to know what losses the results are sensitive to for the sake of tuning hyperparameters this would be especially useful should i wish to apply the proposed method to a different dataset i think the derivation of the jacobian loss requires some more justification the higher order terms in the taylor expansion in 2 and 3 can only be ignored when y2 - y1 is small compared to the coefficients but there is no validationjustification regarding this other qscomments on page 5 in the last paragraph of section 3 you say that after training of the student with d1 is finished we consider it as the new teacher here do you append z to y when you form the new teacher on page 6 in the paragraph for prediction loss you say this allows the decoder to naturally of the attributes i guess you mean this allows the model to give realistic interpolations between y1 and 1 bottom of page 6 here we could have used any random values in lieu of y2 not sure i understand this typo conditionnning conditioning i would be inclined to boost the score up to 7 if the authors include some quantitative results along with more thorough comparisons to fader networks revision the authors updates include further quantitative comparisons to fader networks and ablation studies for the different types of losses addressing the concerns i had in the review hence i have
boosted up my score to 7 docsepthis paper proposed a novel approach for learning disentangled representation from supervised data x as the input image y as different attributes by learning an encoder e and a decoder d so that 1 d(e(x)) reconstructs the image 2 e(d(x)) reconstructs the latent vector in particular for the vectors that are constructed by mingling different portions of the latent vectors extracted from two training samples 3 the jacobian matrix matches and 4 the predicted latent vector matches with the provided attributes in addition the work also proposes to progressively add latent nodes to the network for training the claim is that using this framework one avoids ganstyle training eg fader network which could be unstable and hard to tune although the idea is interesting the experiments are lacking while previous works eg fader network have both qualitative eg image quality when changing attribute values and quantitative results eg classification results of generated image with novel combination of attributes this paper only shows visual comparison fig 4 and fig 5 and its comparison with fader network is a bit vague eg it is not clear to me why fig 5e generated by proposed approach is more natural than fig 5d even if i check the updated version mentioned by the authors comments also in the paper there are five hyperparameters eqn 14 and the central claim is that using jacobian loss is better however there is no ablation study to support the claim andor the design choice in my opinion the paper should show the performance of supervised training of attributes the effects of using jacobian loss andor cycle loss the inception score of generated images etc i acknowledge the authors for their honesty in raising the issues of fig 4 and providing an updated version ### Summary:
the paper proposes a new way to tackle the tradeoff between disentanglement and reconstruction by training a teacher autoencoder that learns to disentangle then distilling into a student model the distillation is encouraged with a loss term that constrains the jacobian in an interesting way the qualitative results with image manipulation are interesting and the general idea seems to be wellliked by the reviewers and myself the main weaknesses of the paper seem to be in the evaluation disentanglement is not exactly easy to measure as such but overall the various ablation studies do show that the jacobian regularization term improves meaningfully over fader nets given the quality of the results and the fact that this work moves the needle in an important albeit hard to define area of learning disentangled representations i think it would be a good piece of work to present at iclr so i recommend acceptance
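To make the teacher-student idea summarized above concrete, here is a small, hedged PyTorch sketch of the kind of training signal being described: the supervised part of the latent code is perturbed and the student's change in reconstruction is pushed toward the teacher's, a finite-difference stand-in for matching decoder Jacobians. The architecture sizes, the perturbation scheme, and the omitted prediction/cycle losses are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AE(nn.Module):
    """Tiny fully connected autoencoder: x -> code (y, z) -> x_hat."""
    def __init__(self, x_dim=64, y_dim=4, z_dim=0):
        super().__init__()
        self.y_dim, self.z_dim = y_dim, z_dim
        self.enc = nn.Linear(x_dim, y_dim + z_dim)
        self.dec = nn.Linear(y_dim + z_dim, x_dim)

    def forward(self, x):
        code = self.enc(x)
        return self.dec(code), code

def student_loss(teacher, student, x, eps=0.1, lam=1.0):
    """Reconstruction + finite-difference Jacobian-matching term.

    The real method also ties the student's y to labels / the teacher's y with
    a prediction loss; that part is omitted to keep the sketch short.
    """
    with torch.no_grad():
        _, y_teacher = teacher(x)                     # teacher code is y only
    x_hat, code = student(x)
    y_s, z_s = code[:, :student.y_dim], code[:, student.y_dim:]

    recon = F.mse_loss(x_hat, x)

    delta = eps * torch.randn_like(y_teacher)         # small perturbation of y
    with torch.no_grad():
        teacher_shift = teacher.dec(y_teacher + delta) - teacher.dec(y_teacher)
    student_shift = (student.dec(torch.cat([y_s + delta, z_s], dim=1))
                     - student.dec(torch.cat([y_s, z_s], dim=1)))
    jacobian_match = F.mse_loss(student_shift, teacher_shift)

    return recon + lam * jacobian_match

if __name__ == "__main__":
    teacher = AE(y_dim=4, z_dim=0)    # disentangles, reconstructs poorly
    student = AE(y_dim=4, z_dim=12)   # extra z dimensions improve reconstruction
    x = torch.randn(8, 64)
    print(student_loss(teacher, student, x).item())
```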
[ input_ids and attention_mask for the example above omitted — a long token-id array and an all-ones array with no additional readable content ]
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 2929, 13698, 281, 3037, 271, 6753, 36465, 326, 476, 320, 908, 281, 8069, 22573, 253, 1929, 12474, 1006, 800, 2616, 285, 436, 4483, 3477, 285, 6537, 19763, 273, 253, 3888, 1223, 9603, 15958, 3888, 50276, 936, 5115, 436, 36165, 253, 32049, 11330, 21624, 2127, 342, 767, 4295, 340, 285, 1182, 835, 340, 403, 502, 17263, 281, 1929, 12474, 970, 22296, 2957, 1223, 1182, 310, 440, 48454, 285, 7194, 4217, 323, 1175, 14433, 533, 521, 9978, 10224, 672, 1182, 310, 10481, 1781, 347, 253, 29810, 476, 3037, 281, 11823, 340, 17965, 4577, 25180, 1182, 5644, 281, 4105, 14433, 50276, 936, 11399, 436, 2523, 253, 4477, 12661, 281, 2126, 247, 5974, 9732, 3733, 22199, 253, 9732, 310, 10166, 824, 326, 253, 32049, 760, 11330, 340, 285, 253, 29810, 326, 760, 3651, 265, 340, 436, 20096, 1175, 557, 290, 606, 1338, 533, 4105, 14433, 9674, 247, 5974, 6753, 36465, 310, 6311, 534, 556, 247, 1199, 4067, 21624, 2127, 285, 11330, 1097, 340, 285, 1182, 253, 340, 4445, 310, 18301, 281, 253, 9732, 2349, 351, 398, 340, 4445, 970, 480, 317, 706, 757, 37820, 50276, 993, 23223, 253, 1543, 273, 2460, 19763, 970, 1929, 12474, 310, 3240, 13943, 253, 4477, 12661, 14586, 281, 253, 480, 317, 706, 757, 37820, 347, 2969, 14433, 11655, 323, 5919, 3733, 253, 2746, 32547, 48960, 3733, 285, 3021, 310, 6927, 281, 6194, 50276, 8265, 3993, 440, 35421, 557, 290, 606, 1338, 1543, 403, 760, 2011, 323, 278, 79, 382, 891, 717, 417, 13762, 2074, 1543, 323, 440, 35421, 557, 290, 606, 1338, 476, 320, 2797, 327, 625, 2570, 15302, 4477, 943, 2486, 690, 1543, 327, 436, 4809, 390, 4796, 253, 15075, 327, 440, 35421, 557, 290, 606, 1338, 671, 672, 12392, 436, 11745, 7103, 323, 557, 290, 606, 1338, 824, 347, 275, 701, 2623, 70, 588, 320, 5322, 281, 452, 50276, 555, 993, 3239, 495, 281, 2612, 404, 50276, 706, 14721, 3239, 608, 1617, 920, 50276, 42743, 5474, 339, 793, 360, 3454, 253, 2929, 29328, 247, 1332, 281, 18915, 253, 557, 290, 606, 1338, 250, 11682, 5454, 2727, 1895, 275, 1142, 557, 290, 36874, 7274, 436, 310, 6786, 407, 806, 3733, 253, 9732, 6753, 36465, 440, 35421, 390, 22296, 326, 33772, 281, 557, 290, 2134, 253, 2616, 273, 7629, 387, 253, 2105, 273, 4105, 14433, 285, 840, 940, 3171, 841, 6311, 14237, 715, 247, 5974, 1566, 342, 4465, 21624, 10103, 835, 841, 4465, 4329, 592, 476, 320, 908, 281, 3157, 253, 49866, 6477, 273, 253, 5974, 6753, 36465, 2429, 281, 253, 9732, 6753, 36465, 253, 940, 21755, 273, 253, 6311, 6779, 310, 14659, 3066, 247, 4460, 480, 317, 706, 757, 2957, 1307, 326, 29426, 253, 1818, 275, 49866, 6477, 273, 253, 9732, 285, 5974, 281, 320, 2074, 672, 253, 21624, 6779, 2544, 627, 310, 581, 3368, 323, 13439, 440, 35421, 557, 290, 36874, 557, 290, 36874, 2803, 407, 2803, 327, 278, 79, 382, 941, 285, 581, 3368, 323, 49863, 29974, 13337, 557, 290, 36874, 327, 6076, 67, 1240, 82, 50276, 856, 84, 50276, 74, 1158, 253, 2934, 273, 31414, 26475, 2616, 273, 7629, 581, 407, 581, 310, 18176, 285, 436, 4620, 281, 320, 581, 273, 253, 806, 5547, 9437, 387, 436, 1895, 50276, 783, 940, 21755, 4620, 281, 789, 973, 327, 253, 278, 79, 382, 941, 285, 1057, 6296, 6379, 253, 14433, 2957, 273, 253, 5974, 2429, 281, 253, 9732, 50276, 783, 18276, 1543, 327, 6076, 67, 1240, 82, 1007, 2266, 3340, 5165, 275, 253, 3492, 342, 253, 2590, 5750, 689, 269, 6475, 6928, 1146, 326, 253, 4081, 1566, 310, 247, 2014, 1566, 326, 476, 26526, 253, 3387, 1027, 12474, 5727, 269, 6475, 37507, 476, 760, 2968, 342, 387, 954, 495, 12474, 591, 
1566, 50276, 5040, 50276, 9088, 403, 417, 2217, 11745, 1543, 8109, 253, 1750, 326, 253, 1566, 310, 3576, 387, 1097, 557, 290, 36874, 285, 14433, 253, 4248, 273, 557, 290, 606, 1338, 275, 253, 14237, 310, 760, 2011, 36143, 3066, 21624, 30370, 285, 760, 323, 247, 2014, 1566, 824, 18276, 1543, 403, 3839, 21291, 281, 33804, 81, 12427, 285, 352, 310, 2834, 281, 27340, 7277, 1027, 557, 290, 36874, 3082, 275, 436, 5133, 436, 5841, 323, 11745, 5593, 273, 557, 290, 606, 1338, 574, 368, 908, 247, 10895, 835, 368, 871, 253, 3216, 5083, 2616, 273, 7629, 24088, 277, 1033, 31320, 19, 69, 15029, 941, 323, 253, 440, 35421, 557, 290, 36874, 1332, 840, 253, 1268, 273, 557, 290, 606, 1338, 275, 253, 6311, 14237, 812, 320, 18755, 285, 3021, 634, 1332, 812, 320, 2429, 1411, 440, 35421, 557, 290, 36874, 1666, 25379, 323, 253, 49863, 29974, 13337, 557, 290, 606, 1338, 1650, 327, 6076, 5830, 368, 812, 323, 1650, 22048, 849, 973, 253, 32049, 26295, 253, 1027, 12474, 984, 627, 310, 3216, 5083, 1060, 24088, 1304, 40373, 339, 273, 253, 340, 261, 327, 247, 2918, 562, 1071, 873, 342, 3216, 5083, 247, 11745, 5301, 342, 269, 6475, 6928, 275, 436, 5133, 4620, 3309, 253, 18276, 5301, 327, 247, 2014, 2454, 275, 4677, 608, 310, 17663, 2822, 4209, 50276, 9088, 310, 11745, 1941, 326, 253, 14433, 2957, 12075, 672, 3733, 253, 5974, 533, 1060, 697, 417, 2590, 1880, 436, 11745, 3064, 2789, 247, 18276, 3064, 275, 253, 49866, 6477, 2970, 2169, 32422, 3888, 310, 581, 273, 253, 42852, 3212, 11138, 49866, 6477, 594, 352, 651, 320, 27096, 281, 7277, 253, 49866, 6477, 273, 253, 9732, 285, 253, 5974, 327, 253, 1072, 2460, 50276, 249, 253, 6076, 5830, 4679, 253, 5649, 273, 5974, 3733, 310, 417, 7985, 275, 253, 1543, 275, 4677, 608, 368, 2168, 921, 326, 253, 9732, 1566, 4245, 12524, 49866, 6477, 2568, 368, 13414, 921, 253, 14433, 323, 253, 5974, 1566, 36878, 368, 921, 326, 352, 19132, 275, 4677, 495, 67, 533, 969, 352, 310, 4409, 12669, 604, 352, 2789, 247, 3064, 25910, 671, 697, 417, 2590, 1880, 4677, 577, 403, 1543, 432, 253, 5974, 1566, 390, 253, 9732, 1566, 516, 29985, 326, 597, 403, 432, 253, 5974, 1566, 50276, 20513, 11745, 1543, 812, 830, 253, 3720, 273, 2509, 28913, 2175, 323, 1016, 273, 253, 1027, 11655, 275, 253, 21842, 2957, 323, 1097, 440, 35421, 50276, 6017, 261, 29974, 13337, 8892, 984, 627, 403, 1142, 4295, 275, 253, 2957, 342, 247, 4373, 19484, 323, 1016, 352, 651, 320, 9371, 281, 871, 752, 11655, 253, 1543, 403, 7996, 281, 323, 253, 13232, 273, 25184, 4373, 22041, 436, 651, 320, 3340, 4217, 943, 891, 5730, 281, 4647, 253, 4081, 1332, 281, 247, 1027, 10895, 50276, 74, 1158, 253, 28529, 273, 253, 480, 317, 706, 757, 2957, 4419, 690, 625, 22861, 253, 2169, 1340, 2426, 275, 253, 246, 9614, 7466, 275, 374, 285, 495, 476, 760, 320, 12841, 672, 340, 19, 50276, 90, 18, 310, 1355, 2429, 281, 253, 10303, 533, 627, 310, 642, 12820, 6309, 1877, 5001, 436, 50276, 977, 2805, 84, 26122, 50276, 251, 3239, 608, 275, 253, 1390, 12494, 273, 2593, 495, 368, 1333, 326, 846, 3733, 273, 253, 5974, 342, 277, 18, 310, 6699, 359, 1908, 352, 347, 253, 747, 9732, 1060, 513, 368, 14801, 1182, 281, 340, 672, 368, 830, 253, 747, 9732, 50276, 251, 3239, 721, 275, 253, 12494, 323, 10554, 2957, 368, 1333, 436, 4483, 253, 29810, 281, 10748, 50276, 1171, 253, 12474, 891, 5476, 368, 1599, 436, 4483, 253, 1566, 281, 1918, 15958, 20670, 569, 875, 340, 18, 285, 337, 50276, 10492, 273, 3239, 721, 1060, 359, 812, 452, 908, 667, 3632, 2193, 275, 32470, 273, 340, 19, 50276, 1439, 2119, 891, 2096, 436, 50276, 555, 5367, 1617, 79, 920, 50276, 42743, 50276, 74, 651, 
320, 21802, 281, 9510, 253, 4868, 598, 281, 818, 604, 253, 4477, 2486, 690, 11745, 1543, 2112, 342, 625, 11080, 14023, 281, 269, 6475, 6928, 50275, 250, 4694, 50276, 783, 4477, 11269, 2486, 2007, 11745, 14023, 281, 269, 6475, 6928, 285, 28913, 2175, 323, 253, 1027, 3510, 273, 11655, 15974, 253, 7350, 891, 574, 275, 253, 2278, 7613, 891, 452, 46002, 598, 619, 4868, 281, 818, 7152, 33032, 2520, 2929, 4081, 247, 4460, 2746, 323, 4715, 557, 290, 33195, 6779, 432, 22296, 941, 1269, 347, 253, 3280, 2460, 340, 347, 1027, 12474, 407, 4715, 271, 32049, 299, 285, 247, 29810, 277, 594, 326, 337, 27625, 17029, 84, 253, 2460, 374, 1407, 89, 17029, 253, 21624, 4972, 275, 1798, 323, 253, 11390, 326, 403, 8818, 407, 43261, 1981, 1027, 5110, 273, 253, 21624, 11390, 10375, 432, 767, 3733, 3530, 495, 253, 480, 317, 706, 757, 4315, 10129, 285, 577, 253, 8131, 21624, 4972, 10129, 342, 253, 2530, 12474, 275, 1635, 253, 789, 671, 29328, 281, 31414, 823, 21624, 7632, 281, 253, 2990, 323, 3733, 253, 1750, 310, 326, 970, 436, 7792, 581, 3693, 36827, 4826, 3733, 24088, 269, 6475, 2990, 534, 812, 320, 17631, 285, 1892, 281, 19928, 50275, 20261, 253, 2934, 310, 4722, 253, 4679, 403, 14999, 1223, 2045, 2987, 24088, 269, 6475, 2990, 556, 1097, 18276, 24088, 2460, 3290, 672, 6890, 11104, 2193, 285, 11745, 1543, 24088, 9162, 1543, 273, 4561, 2460, 342, 4460, 5019, 273, 12474, 436, 2929, 760, 2722, 5304, 5301, 3036, 577, 285, 3036, 608, 285, 697, 5301, 342, 269, 6475, 2990, 310, 247, 2372, 21248, 24088, 352, 310, 417, 2590, 281, 479, 2139, 3036, 608, 70, 4561, 407, 4081, 2746, 310, 625, 3626, 685, 3036, 608, 69, 1014, 604, 891, 2451, 253, 9300, 2715, 5393, 407, 253, 4477, 5701, 671, 275, 253, 2929, 627, 403, 2620, 4373, 22041, 16186, 79, 1638, 285, 253, 4055, 1750, 310, 326, 970, 480, 317, 706, 757, 2957, 310, 1805, 2299, 627, 310, 642, 28913, 1263, 281, 1329, 253, 1750, 285, 263, 253, 2216, 4327, 432, 619, 4743, 253, 2929, 943, 921, 253, 3045, 273, 22296, 3733, 273, 12474, 253, 2538, 273, 970, 480, 317, 706, 757, 2957, 285, 263, 5880, 2957, 253, 39645, 4868, 273, 4561, 3888, 3966, 50275, 74, 14409, 253, 4477, 323, 616, 33773, 275, 12976, 253, 3374, 273, 3036, 577, 285, 5277, 271, 9300, 2715, 2490, 187, 4118, 18435, 27, 783, 2929, 29328, 247, 747, 1039, 281, 18915, 253, 5454, 2727, 875, 557, 290, 606, 1338, 285, 14433, 407, 3733, 247, 9732, 6753, 36465, 326, 33772, 281, 557, 290, 2134, 840, 940, 3867, 715, 247, 5974, 1566, 253, 940, 21755, 310, 14659, 342, 247, 2957, 1307, 326, 1030, 44196, 253, 480, 317, 706, 757, 275, 271, 4722, 1039, 253, 18276, 1543, 342, 2460, 19763, 403, 4722, 285, 253, 2087, 2934, 3133, 281, 320, 973, 44939, 407, 253, 30628, 285, 4266, 50276, 783, 2022, 32213, 273, 253, 2929, 1646, 281, 320, 275, 253, 7103, 557, 290, 606, 1338, 310, 417, 4555, 3477, 281, 2557, 347, 824, 533, 4583, 253, 2710, 28913, 2175, 513, 921, 326, 253, 480, 317, 706, 757, 37820, 1307, 19132, 4495, 2920, 689, 269, 6475, 37507, 1677, 253, 3290, 273, 253, 1543, 285, 253, 958, 326, 436, 789, 9727, 253, 15460, 275, 271, 1774, 23447, 1892, 281, 4853, 2170, 273, 4715, 557, 290, 33195, 14237, 891, 1158, 651, 320, 247, 1175, 5313, 273, 789, 281, 1246, 387, 17857, 32888, 594, 891, 5583, 14924 ]
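The three list-valued columns shown for each row (input_ids, attention_mask, labels) are redundant with the readable Input/Output text: labels repeats input_ids token for token, and attention_mask is a run of ones of the same length, which indicates unpadded sequences. As a rough illustration only, the sketch below shows how one such row could be sanity-checked and decoded back to text; the tokenizer that produced these ids is not named anywhere in this dump, so the model name is a placeholder assumption, not the real one.

```python
from transformers import AutoTokenizer

def inspect_row(row, tokenizer_name="some/tokenizer"):
    """Sanity-check one row of the dump shown above."""
    # Placeholder name: the dump does not say which tokenizer produced the ids.
    tokenizer = AutoTokenizer.from_pretrained(tokenizer_name)

    # labels mirrors input_ids token for token (causal-LM style), and
    # attention_mask is a run of ones because the rows are unpadded.
    assert row["labels"] == row["input_ids"]
    assert all(m == 1 for m in row["attention_mask"])
    assert len(row["attention_mask"]) == len(row["input_ids"])

    # Decoding input_ids should roughly reproduce the concatenated
    # Input + "### Summary:" + Output string of the same row.
    decoded = tokenizer.decode(row["input_ids"], skip_special_tokens=True)
    print(decoded[:120], "...")
    return decoded
```

Under that assumption, decoding a row's input_ids would reproduce the same prompt, review, and summary text that the string columns already show, confirming that the numeric columns add no information beyond the text.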
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: the authors propose relational proxies a novel approach that leverages the relational information between the global and local views of an object for encoding its semantic label i think the main novelty comes from the introduced relational proxies and the corresponding comprehensive theoretical and experimental analysis weaknesses one area of improvement for the paper at hand would be clarity especially with respect to the exposition of the proposed architecture it takes multiple read-throughs in order to identify the actual architecture proposed lack of experiments on larger datasets in a time of evergrowing dataset sizes it would be good to provide and compare results of such models when trained on larger datasets this is important for judging the impact as improvements stemming from architecture engineering typically vanish with growing dataset sizes na docsepsummary this paper is dedicated to developing algorithms for finegrained image recognition they argue it is not enough to distinguish finegrained categories only based on partial information therefore they propose relational proxies which leverage the relational information between the global and local views of an object they also provide theoretical explanations to support the effectiveness of the proposed methods experiments on six finegrained benchmark datasets offer positive results pros 1 the proposed methods make sense and are wellmotivated both theoretical and empirical analyses are provided to support the effectiveness 2 the paper is well written and easy to follow figure 1 is informative and illustrative cons 1 there are missing numbers in table 1 for a comprehensive comparison it is necessary to complete it minor the caption of tables should be above the content 2 the performance gains are marginal especially on cub 03 and na birds 02 any explanations it seems the proposed methods work less well for bird images meanwhile the error bar is required for table 1 since the current accuracy margin is too small 3 more network backbones are needed to support the generalization of proposed methods across architectures 4 as for the study in figure 2 the number of local views l and the size of the local patch should be correlated a detailed analysis is needed both limitations and potential negative social impacts are discussed in the submission docsepin the finegrained setting discriminating between different classes requires learning how different local parts combine to form the object of interest in this work the authors introduce a theoretical framework and a novel method that decomposes fgvc tasks into relationagnostic feature extraction and crossview relation learning they show the superiority of such a method through a set of experiments the problem this work addresses is relevant while i am unable to gauge the relevance of the proposed theoretical framework and its broad usefulness to the community the proposed approach is solid and of interest strengths 1 the experimental setup is solid the authors test their method on multiple datasets and consistently show competitive results across all of them these results support the idea of modeling the crossview relation in their proposed architecture this is further demonstrated through ablation studies that highlight the need for a crossview relational function and provide further insights into the method 2 the theoretical framework established in this work is clear and
while it builds on a lot of definitions the proofs are simple and easy to follow weaknesses 1 the permutation invariance property is counterintuitive the authors introduce the permutation invariance property as a necessary property for a model to solve fgvc tasks in section 3.4 they introduce a novel transformer ast that does not use positional embeddings and is thus permutation invariant while the permutation invariance property can provide certain desired properties like potentially better generalization the authors do not provide theoretical evidence for why it would be necessary and it is not fully clear how it is motivated either certain claims in the introduction would actually suggest otherwise differ only in the way the attributes combine to generate the global view of the object or features like the distance between the head and the body or the angular orientation of the legs this suggests that features like positional encoding would actually be critical 2 the term relation is not explicitly defined and it is unclear what the authors mean the relationagnostic representation is established in definition 4 and while it is clear what it means in mathematical terms its relation to fgvc problems is not evident providing more clarifications would make the text easier to follow 3 decomposing the problem into a relationagnostic encoder and a crossview relational function the authors do not argue for why this decomposition is necessary when solving fgvc problems at least in the main text a discussion of this can be found in appendix 6.3 and i would argue for including this in the main text as it better explains the idea of the relational gap and seems to at least provide an initial motivation for this decomposition the authors discuss one of the main limitations being the local view generation docsepin this paper the authors propose to address finegrained image classification from a novel perspective namely for finegrained classes with visually similar local features more attention should be paid to leveraging the relational information between the global and local views of an object for encoding its semantic label relational proxies are designed based on the proposed theory and achieve superior results on multiple benchmark datasets strengths a hypothesis is made for finegrained visual classification ie when two categories possess the same local attributes and differ only in the way the attributes combine to generate the global view of the object relationagnostic approaches do not capture the full semantic information in an input image this hypothesis is then proved by theory and validated in the experimental parts which is the main theoretical contribution of the paper i do like this novel perspective weakness i do not have major concerns regarding the technical details however the visualization analysis of the proposed method is lacking for example under what circumstances the proposed model significantly improves the performance and when it fails as noted by the authors the main limitation is that the local views obtained by the proposed method are cropped from the global view which may not be the best representations of local parts docsepthis paper proposes a method based on relationships between views of an object to perform fine grained visual categorization the authors hypothesise that not only representing local parts but relating them is pivotal to achieve good performance the authors then experiment using their proposed approach on different fgvc datasets
improving on current methods strengths 1 the point the paper makes that the relationship between object parts is important for learning the classification label makes intuitive sense and seems to be important in their experiments 2 the paper is reasonably clear as to the motivations and experiments run by the authors weaknesses 1 the authors in my opinion overcomplicate the mathematical explanation spending two pages explaining why modelling the relationships between object parts is important i do not think this adds much to the paper and the space would be better spent giving a high level clear explanation of the intuition 2 i am not sure why their ast ie how they combine local parts is so different from an attention layer eg in transfg it seems to be performing the same operation so i do not understand how this is fundamentally different from the transfg architecture is the main difference the global embedding of the object that the local crops can be compared against and using all three embeddings the zg zl r when computing distances and the final metric 3 in general the paper is clear their main contribution is using the global local and r information in the metric learning and so the insight is that using all 3 sources of information is the most useful however in the experiments there seems to be limited improvement from these properties over the base encoder table 2 so i wonder how useful these things really are moreover the improvements are small over standard methods and there is no standard deviation to indicate whether these results are significant i further wonder if the authors carefully made sure that their setup was similar to the underlying setup and that the improvement was actually due to their method or better data augmentation or underlying architectures the authors discuss limitations ### Summary:
this paper proposes a novel approach for finegrained image recognition which utilizes the relational information between the global and local views of an object it is a reasonable and important finding that not only representing local parts but also relating them is critical to establishing superior performance the authors validate their proposals effectiveness with both theoretical explanations and positive empirical results on various benchmarks the authors also did a great job in the rebuttal they provided more clarifications extra experiments on large datasets and newly included error bars most of the reviewers are satisfied with the rebuttals and discussions and all reviewers have a consistent recommendation we think this paper can bring new insights to the visual recognition community and help people understand how the key features and their relations work please also include the newly added experiments and clarifications in the new revision
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 12661, 38524, 16843, 447, 247, 4460, 2746, 326, 19732, 1131, 253, 38524, 1491, 875, 253, 4156, 285, 1980, 6849, 273, 271, 1789, 323, 9706, 697, 24705, 5203, 50275, 74, 1158, 253, 2022, 38135, 3249, 432, 253, 5611, 38524, 16843, 447, 285, 253, 3969, 11088, 10527, 285, 5661, 1783, 50276, 20881, 1255, 265, 581, 2170, 273, 7756, 323, 253, 2929, 387, 1133, 651, 320, 19843, 3340, 342, 1675, 281, 253, 47284, 273, 253, 4081, 10336, 352, 3936, 2709, 1239, 949, 84, 275, 1340, 281, 4271, 253, 4588, 10336, 4081, 50276, 77, 471, 273, 4679, 327, 4067, 15302, 275, 247, 673, 273, 2455, 33601, 10895, 9552, 352, 651, 320, 1175, 281, 2085, 285, 7277, 1543, 273, 824, 19286, 672, 10166, 327, 4067, 15302, 436, 310, 1774, 323, 32721, 253, 3486, 347, 49831, 942, 45030, 432, 10336, 11369, 5431, 29259, 342, 5675, 10895, 9552, 5549, 5474, 339, 793, 360, 3454, 50276, 2520, 2929, 310, 9940, 281, 6684, 11333, 323, 4030, 72, 11273, 2460, 8981, 597, 9059, 352, 310, 417, 2217, 281, 12129, 4030, 72, 11273, 9050, 760, 1754, 327, 7898, 1491, 3103, 597, 12661, 38524, 16843, 447, 534, 25057, 253, 38524, 1491, 875, 253, 4156, 285, 1980, 6849, 273, 271, 1789, 597, 671, 2085, 10527, 22909, 281, 1329, 253, 12510, 273, 253, 4081, 3082, 4679, 327, 2800, 4030, 72, 11273, 22791, 15302, 3959, 2762, 1543, 5847, 50276, 18, 253, 4081, 3082, 1056, 3282, 285, 403, 973, 24013, 8550, 1097, 10527, 285, 16774, 6260, 403, 2530, 281, 1329, 253, 12510, 50276, 19, 253, 2929, 310, 973, 3542, 285, 3477, 281, 956, 4677, 337, 310, 27096, 285, 47386, 50274, 5040, 50276, 18, 627, 403, 5816, 3904, 275, 2829, 337, 323, 247, 11088, 5301, 352, 310, 3309, 281, 3426, 352, 5884, 253, 11743, 273, 7180, 943, 320, 327, 253, 1840, 2600, 50276, 19, 253, 3045, 15988, 403, 16888, 3340, 327, 12966, 17272, 285, 5549, 11260, 16261, 667, 22909, 352, 3133, 253, 4081, 3082, 403, 1679, 2444, 323, 12621, 3888, 26614, 253, 2228, 2534, 310, 2424, 323, 2829, 18, 1580, 253, 1655, 7200, 8459, 310, 1512, 1355, 50276, 20, 625, 2990, 896, 47473, 403, 3058, 281, 1329, 253, 26647, 273, 4081, 3082, 2439, 35615, 50276, 21, 347, 323, 253, 1263, 275, 4677, 374, 253, 1180, 273, 1980, 6849, 298, 285, 253, 1979, 273, 253, 1980, 12097, 943, 320, 9578, 247, 7000, 1783, 310, 3058, 1097, 7364, 285, 2442, 4016, 2675, 16274, 403, 5469, 275, 253, 19529, 5474, 339, 9852, 253, 4030, 72, 11273, 4758, 7134, 8779, 875, 1027, 5971, 4419, 4715, 849, 1027, 1980, 4243, 13398, 281, 830, 253, 1789, 273, 1600, 275, 436, 789, 253, 4477, 9569, 247, 10527, 7792, 285, 247, 4460, 1332, 326, 11101, 6013, 269, 72, 16788, 8892, 715, 5886, 1530, 6932, 4735, 11998, 285, 2831, 1374, 5886, 4715, 597, 921, 253, 34385, 273, 824, 1332, 949, 247, 873, 273, 4679, 253, 1895, 436, 789, 2953, 310, 4623, 1223, 891, 717, 7591, 281, 11206, 253, 17200, 273, 253, 4081, 10527, 7792, 285, 697, 3862, 31471, 281, 253, 3114, 253, 4081, 2746, 310, 4891, 285, 273, 1600, 50276, 296, 3755, 20556, 337, 253, 5661, 9978, 310, 4891, 253, 4477, 1071, 616, 1332, 327, 2709, 15302, 285, 12724, 921, 12085, 1543, 2439, 512, 273, 731, 841, 1543, 1329, 253, 2934, 273, 14053, 253, 2831, 1374, 5886, 275, 616, 4081, 10336, 436, 310, 2007, 5183, 949, 28913, 2175, 326, 6780, 253, 878, 323, 247, 2831, 1374, 38524, 1159, 285, 2085, 2007, 16039, 715, 253, 1332, 374, 253, 10527, 7792, 4232, 275, 436, 789, 310, 2590, 285, 1223, 352, 21168, 327, 247, 2257, 273, 14308, 253, 27947, 403, 2969, 285, 3477, 281, 956, 
50275, 20881, 1255, 265, 337, 253, 29391, 31429, 2867, 310, 4828, 565, 48714, 253, 4477, 9569, 253, 29391, 31429, 2867, 347, 247, 3309, 2867, 323, 247, 1566, 281, 8415, 269, 72, 16788, 8892, 275, 2593, 5910, 597, 9569, 247, 4460, 39707, 7846, 326, 1057, 417, 897, 40798, 46234, 285, 310, 3021, 29391, 13727, 1223, 253, 29391, 31429, 2867, 476, 2085, 2176, 6799, 3607, 751, 7826, 1805, 26647, 253, 4477, 513, 417, 2085, 10527, 1941, 323, 2139, 352, 651, 320, 3309, 285, 352, 310, 417, 4751, 2590, 849, 352, 310, 17194, 2057, 2176, 3916, 275, 253, 10199, 651, 2686, 1804, 5010, 9184, 760, 275, 253, 1039, 253, 12474, 13398, 281, 6635, 253, 4156, 1859, 273, 253, 1789, 390, 3386, 751, 253, 4181, 875, 253, 1481, 285, 253, 2133, 390, 253, 12336, 11259, 273, 253, 9246, 436, 1804, 326, 3386, 751, 40798, 9706, 651, 2686, 320, 4619, 50276, 19, 253, 1307, 5886, 310, 417, 11120, 2931, 285, 352, 310, 12744, 752, 253, 4477, 1599, 253, 5886, 1530, 6932, 6779, 310, 4232, 275, 5426, 577, 285, 1223, 352, 310, 2590, 752, 352, 2097, 275, 15965, 2426, 697, 5886, 281, 269, 72, 16788, 3237, 310, 417, 8943, 5277, 625, 8254, 6787, 651, 1056, 253, 2505, 6927, 281, 956, 495, 11101, 28163, 253, 1895, 715, 5886, 1530, 6932, 32049, 285, 247, 2831, 1374, 38524, 1159, 253, 4477, 513, 417, 9059, 323, 2139, 436, 14717, 310, 3309, 672, 16161, 269, 72, 16788, 3237, 387, 1878, 275, 253, 2022, 2505, 247, 5955, 273, 436, 476, 320, 1119, 275, 30762, 9654, 285, 891, 651, 9059, 323, 1690, 436, 275, 253, 2022, 2505, 347, 352, 1805, 11424, 253, 2934, 273, 253, 38524, 8037, 285, 3133, 281, 387, 1878, 2085, 271, 3302, 16038, 323, 436, 14717, 50276, 783, 4477, 2319, 581, 273, 253, 2022, 12291, 1146, 253, 1980, 1859, 5978, 50275, 7152, 339, 9852, 436, 2929, 4477, 12661, 281, 2953, 253, 4030, 72, 11273, 2460, 9162, 432, 247, 4460, 8668, 10775, 323, 253, 4030, 72, 11273, 5971, 342, 253, 25910, 2074, 1980, 3386, 625, 4116, 943, 320, 1160, 327, 19732, 2977, 253, 38524, 1491, 875, 253, 4156, 285, 1980, 6849, 273, 271, 1789, 323, 9706, 697, 24705, 5203, 38524, 16843, 447, 403, 4158, 1754, 327, 253, 4081, 3762, 285, 5115, 253, 8936, 1543, 327, 2709, 22791, 15302, 20544, 247, 9079, 310, 1160, 323, 4030, 72, 11273, 5304, 9162, 26332, 672, 767, 9050, 7081, 1072, 1980, 12474, 285, 9184, 760, 275, 253, 1039, 253, 12474, 13398, 281, 6635, 253, 4156, 1859, 273, 253, 1789, 5886, 1530, 6932, 7274, 513, 417, 9232, 253, 2120, 24705, 1491, 275, 271, 3280, 2460, 436, 9079, 310, 840, 8058, 407, 3762, 285, 17618, 275, 253, 5661, 4243, 534, 310, 253, 2022, 10527, 7680, 273, 253, 2929, 891, 513, 751, 436, 4460, 8668, 50276, 20881, 1255, 891, 513, 417, 452, 2201, 7350, 5001, 253, 7681, 4278, 2299, 253, 24426, 1783, 273, 253, 4081, 1332, 310, 14999, 323, 1650, 762, 752, 26741, 326, 253, 4081, 1566, 476, 1534, 19132, 253, 16226, 285, 672, 1057, 352, 1891, 347, 4879, 407, 4477, 253, 2022, 12291, 310, 2783, 347, 253, 1980, 6849, 2797, 407, 253, 4081, 1332, 310, 9187, 1882, 432, 253, 4156, 1859, 534, 778, 417, 320, 253, 1682, 14237, 273, 1980, 4243, 5474, 33032, 2520, 2929, 29328, 247, 1332, 1754, 327, 7688, 875, 6849, 273, 271, 1789, 281, 1347, 4030, 7098, 967, 5304, 13213, 1320, 253, 4477, 6482, 885, 326, 417, 760, 9999, 1980, 4243, 533, 12600, 731, 310, 30847, 281, 5115, 1175, 3045, 253, 4477, 840, 3368, 970, 616, 4081, 2746, 327, 1027, 269, 72, 16788, 15302, 11138, 327, 1655, 3082, 20544, 337, 253, 1127, 253, 2929, 2789, 326, 253, 2954, 875, 1789, 4243, 310, 1774, 323, 4715, 253, 9162, 5203, 2789, 27350, 3282, 285, 3133, 281, 320, 1774, 275, 616, 4679, 50276, 19, 253, 
2929, 310, 12054, 2590, 347, 281, 253, 42852, 285, 4679, 1408, 407, 253, 4477, 50276, 20881, 1255, 265, 337, 253, 4477, 275, 619, 4743, 689, 5177, 366, 253, 15965, 8813, 9100, 767, 7223, 15571, 2139, 26278, 253, 7688, 875, 1789, 4243, 310, 1774, 891, 513, 417, 1158, 436, 11323, 1199, 281, 253, 2929, 285, 253, 2317, 651, 320, 1805, 5262, 4933, 247, 1029, 1268, 2590, 8813, 273, 253, 30328, 50276, 19, 891, 717, 417, 2119, 2139, 616, 7846, 849, 597, 13398, 1980, 4243, 310, 594, 1027, 432, 271, 4116, 3828, 24088, 275, 811, 16054, 352, 3133, 281, 320, 9591, 253, 1072, 4254, 594, 891, 513, 417, 2096, 849, 436, 310, 26401, 1027, 685, 253, 811, 16054, 10336, 310, 253, 2022, 3064, 253, 4156, 21496, 273, 253, 1789, 326, 253, 1980, 19492, 476, 320, 2429, 1411, 285, 970, 512, 1264, 46234, 50276, 783, 1182, 72, 1182, 77, 391, 50276, 9453, 12672, 13849, 285, 253, 2457, 7982, 50276, 20, 275, 2087, 253, 2929, 310, 2590, 616, 2022, 7680, 310, 970, 253, 4156, 1980, 285, 391, 1491, 275, 253, 7982, 4715, 285, 594, 253, 12288, 310, 326, 970, 512, 495, 4973, 273, 1491, 310, 253, 954, 4217, 2299, 275, 253, 4679, 627, 3133, 281, 320, 3710, 7756, 432, 841, 3607, 689, 253, 2613, 32049, 2829, 374, 594, 891, 4282, 849, 4217, 841, 1841, 1663, 403, 25761, 253, 11701, 403, 1355, 689, 2629, 3082, 285, 627, 310, 642, 2629, 11254, 281, 5513, 604, 841, 1543, 403, 1534, 891, 2007, 4282, 604, 253, 4477, 9257, 1160, 2119, 326, 616, 9978, 369, 2074, 281, 253, 6944, 9978, 285, 326, 253, 7756, 369, 2686, 1955, 281, 616, 1332, 390, 1805, 941, 42072, 390, 6944, 35615, 253, 4477, 2319, 7364, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 4460, 2746, 323, 4030, 72, 11273, 2460, 8981, 534, 29820, 253, 38524, 1491, 875, 253, 4156, 285, 1980, 6849, 273, 271, 1789, 50276, 262, 310, 247, 5272, 285, 1774, 4560, 326, 417, 760, 9999, 1980, 4243, 533, 12600, 731, 403, 4619, 281, 14631, 8936, 3045, 50276, 783, 4477, 17813, 616, 18595, 12510, 342, 1097, 10527, 22909, 285, 2762, 16774, 1543, 327, 2710, 49602, 50276, 783, 4477, 671, 858, 247, 1270, 2628, 275, 30080, 22559, 597, 2085, 625, 8254, 6787, 4465, 4679, 327, 1781, 15302, 285, 9841, 2908, 2228, 8965, 50274, 2252, 273, 253, 30628, 403, 10048, 342, 253, 30080, 85, 932, 285, 11985, 50276, 395, 512, 30628, 452, 247, 5185, 17401, 50276, 664, 1158, 436, 2929, 476, 3324, 747, 16039, 281, 253, 5304, 8981, 3114, 285, 1361, 952, 2096, 849, 253, 2234, 3386, 285, 616, 2493, 789, 50275, 32897, 671, 2486, 253, 9841, 2879, 4679, 285, 8254, 6787, 275, 253, 747, 18520, 50276 ]
[ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ]
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 783, 4477, 12661, 38524, 16843, 447, 247, 4460, 2746, 326, 19732, 1131, 253, 38524, 1491, 875, 253, 4156, 285, 1980, 6849, 273, 271, 1789, 323, 9706, 697, 24705, 5203, 50275, 74, 1158, 253, 2022, 38135, 3249, 432, 253, 5611, 38524, 16843, 447, 285, 253, 3969, 11088, 10527, 285, 5661, 1783, 50276, 20881, 1255, 265, 581, 2170, 273, 7756, 323, 253, 2929, 387, 1133, 651, 320, 19843, 3340, 342, 1675, 281, 253, 47284, 273, 253, 4081, 10336, 352, 3936, 2709, 1239, 949, 84, 275, 1340, 281, 4271, 253, 4588, 10336, 4081, 50276, 77, 471, 273, 4679, 327, 4067, 15302, 275, 247, 673, 273, 2455, 33601, 10895, 9552, 352, 651, 320, 1175, 281, 2085, 285, 7277, 1543, 273, 824, 19286, 672, 10166, 327, 4067, 15302, 436, 310, 1774, 323, 32721, 253, 3486, 347, 49831, 942, 45030, 432, 10336, 11369, 5431, 29259, 342, 5675, 10895, 9552, 5549, 5474, 339, 793, 360, 3454, 50276, 2520, 2929, 310, 9940, 281, 6684, 11333, 323, 4030, 72, 11273, 2460, 8981, 597, 9059, 352, 310, 417, 2217, 281, 12129, 4030, 72, 11273, 9050, 760, 1754, 327, 7898, 1491, 3103, 597, 12661, 38524, 16843, 447, 534, 25057, 253, 38524, 1491, 875, 253, 4156, 285, 1980, 6849, 273, 271, 1789, 597, 671, 2085, 10527, 22909, 281, 1329, 253, 12510, 273, 253, 4081, 3082, 4679, 327, 2800, 4030, 72, 11273, 22791, 15302, 3959, 2762, 1543, 5847, 50276, 18, 253, 4081, 3082, 1056, 3282, 285, 403, 973, 24013, 8550, 1097, 10527, 285, 16774, 6260, 403, 2530, 281, 1329, 253, 12510, 50276, 19, 253, 2929, 310, 973, 3542, 285, 3477, 281, 956, 4677, 337, 310, 27096, 285, 47386, 50274, 5040, 50276, 18, 627, 403, 5816, 3904, 275, 2829, 337, 323, 247, 11088, 5301, 352, 310, 3309, 281, 3426, 352, 5884, 253, 11743, 273, 7180, 943, 320, 327, 253, 1840, 2600, 50276, 19, 253, 3045, 15988, 403, 16888, 3340, 327, 12966, 17272, 285, 5549, 11260, 16261, 667, 22909, 352, 3133, 253, 4081, 3082, 403, 1679, 2444, 323, 12621, 3888, 26614, 253, 2228, 2534, 310, 2424, 323, 2829, 18, 1580, 253, 1655, 7200, 8459, 310, 1512, 1355, 50276, 20, 625, 2990, 896, 47473, 403, 3058, 281, 1329, 253, 26647, 273, 4081, 3082, 2439, 35615, 50276, 21, 347, 323, 253, 1263, 275, 4677, 374, 253, 1180, 273, 1980, 6849, 298, 285, 253, 1979, 273, 253, 1980, 12097, 943, 320, 9578, 247, 7000, 1783, 310, 3058, 1097, 7364, 285, 2442, 4016, 2675, 16274, 403, 5469, 275, 253, 19529, 5474, 339, 9852, 253, 4030, 72, 11273, 4758, 7134, 8779, 875, 1027, 5971, 4419, 4715, 849, 1027, 1980, 4243, 13398, 281, 830, 253, 1789, 273, 1600, 275, 436, 789, 253, 4477, 9569, 247, 10527, 7792, 285, 247, 4460, 1332, 326, 11101, 6013, 269, 72, 16788, 8892, 715, 5886, 1530, 6932, 4735, 11998, 285, 2831, 1374, 5886, 4715, 597, 921, 253, 34385, 273, 824, 1332, 949, 247, 873, 273, 4679, 253, 1895, 436, 789, 2953, 310, 4623, 1223, 891, 717, 7591, 281, 11206, 253, 17200, 273, 253, 4081, 10527, 7792, 285, 697, 3862, 31471, 281, 253, 3114, 253, 4081, 2746, 310, 4891, 285, 273, 1600, 50276, 296, 3755, 20556, 337, 253, 5661, 9978, 310, 4891, 253, 4477, 1071, 616, 1332, 327, 2709, 15302, 285, 12724, 921, 12085, 1543, 2439, 512, 273, 731, 841, 1543, 1329, 253, 2934, 273, 14053, 253, 2831, 1374, 5886, 275, 616, 4081, 10336, 436, 310, 2007, 5183, 949, 28913, 2175, 326, 6780, 253, 878, 323, 247, 2831, 1374, 38524, 1159, 285, 2085, 2007, 16039, 715, 253, 1332, 374, 253, 10527, 7792, 4232, 275, 436, 789, 310, 2590, 285, 1223, 352, 21168, 327, 247, 2257, 273, 14308, 253, 27947, 403, 2969, 285, 3477, 281, 956, 
50275, 20881, 1255, 265, 337, 253, 29391, 31429, 2867, 310, 4828, 565, 48714, 253, 4477, 9569, 253, 29391, 31429, 2867, 347, 247, 3309, 2867, 323, 247, 1566, 281, 8415, 269, 72, 16788, 8892, 275, 2593, 5910, 597, 9569, 247, 4460, 39707, 7846, 326, 1057, 417, 897, 40798, 46234, 285, 310, 3021, 29391, 13727, 1223, 253, 29391, 31429, 2867, 476, 2085, 2176, 6799, 3607, 751, 7826, 1805, 26647, 253, 4477, 513, 417, 2085, 10527, 1941, 323, 2139, 352, 651, 320, 3309, 285, 352, 310, 417, 4751, 2590, 849, 352, 310, 17194, 2057, 2176, 3916, 275, 253, 10199, 651, 2686, 1804, 5010, 9184, 760, 275, 253, 1039, 253, 12474, 13398, 281, 6635, 253, 4156, 1859, 273, 253, 1789, 390, 3386, 751, 253, 4181, 875, 253, 1481, 285, 253, 2133, 390, 253, 12336, 11259, 273, 253, 9246, 436, 1804, 326, 3386, 751, 40798, 9706, 651, 2686, 320, 4619, 50276, 19, 253, 1307, 5886, 310, 417, 11120, 2931, 285, 352, 310, 12744, 752, 253, 4477, 1599, 253, 5886, 1530, 6932, 6779, 310, 4232, 275, 5426, 577, 285, 1223, 352, 310, 2590, 752, 352, 2097, 275, 15965, 2426, 697, 5886, 281, 269, 72, 16788, 3237, 310, 417, 8943, 5277, 625, 8254, 6787, 651, 1056, 253, 2505, 6927, 281, 956, 495, 11101, 28163, 253, 1895, 715, 5886, 1530, 6932, 32049, 285, 247, 2831, 1374, 38524, 1159, 253, 4477, 513, 417, 9059, 323, 2139, 436, 14717, 310, 3309, 672, 16161, 269, 72, 16788, 3237, 387, 1878, 275, 253, 2022, 2505, 247, 5955, 273, 436, 476, 320, 1119, 275, 30762, 9654, 285, 891, 651, 9059, 323, 1690, 436, 275, 253, 2022, 2505, 347, 352, 1805, 11424, 253, 2934, 273, 253, 38524, 8037, 285, 3133, 281, 387, 1878, 2085, 271, 3302, 16038, 323, 436, 14717, 50276, 783, 4477, 2319, 581, 273, 253, 2022, 12291, 1146, 253, 1980, 1859, 5978, 50275, 7152, 339, 9852, 436, 2929, 4477, 12661, 281, 2953, 253, 4030, 72, 11273, 2460, 9162, 432, 247, 4460, 8668, 10775, 323, 253, 4030, 72, 11273, 5971, 342, 253, 25910, 2074, 1980, 3386, 625, 4116, 943, 320, 1160, 327, 19732, 2977, 253, 38524, 1491, 875, 253, 4156, 285, 1980, 6849, 273, 271, 1789, 323, 9706, 697, 24705, 5203, 38524, 16843, 447, 403, 4158, 1754, 327, 253, 4081, 3762, 285, 5115, 253, 8936, 1543, 327, 2709, 22791, 15302, 20544, 247, 9079, 310, 1160, 323, 4030, 72, 11273, 5304, 9162, 26332, 672, 767, 9050, 7081, 1072, 1980, 12474, 285, 9184, 760, 275, 253, 1039, 253, 12474, 13398, 281, 6635, 253, 4156, 1859, 273, 253, 1789, 5886, 1530, 6932, 7274, 513, 417, 9232, 253, 2120, 24705, 1491, 275, 271, 3280, 2460, 436, 9079, 310, 840, 8058, 407, 3762, 285, 17618, 275, 253, 5661, 4243, 534, 310, 253, 2022, 10527, 7680, 273, 253, 2929, 891, 513, 751, 436, 4460, 8668, 50276, 20881, 1255, 891, 513, 417, 452, 2201, 7350, 5001, 253, 7681, 4278, 2299, 253, 24426, 1783, 273, 253, 4081, 1332, 310, 14999, 323, 1650, 762, 752, 26741, 326, 253, 4081, 1566, 476, 1534, 19132, 253, 16226, 285, 672, 1057, 352, 1891, 347, 4879, 407, 4477, 253, 2022, 12291, 310, 2783, 347, 253, 1980, 6849, 2797, 407, 253, 4081, 1332, 310, 9187, 1882, 432, 253, 4156, 1859, 534, 778, 417, 320, 253, 1682, 14237, 273, 1980, 4243, 5474, 33032, 2520, 2929, 29328, 247, 1332, 1754, 327, 7688, 875, 6849, 273, 271, 1789, 281, 1347, 4030, 7098, 967, 5304, 13213, 1320, 253, 4477, 6482, 885, 326, 417, 760, 9999, 1980, 4243, 533, 12600, 731, 310, 30847, 281, 5115, 1175, 3045, 253, 4477, 840, 3368, 970, 616, 4081, 2746, 327, 1027, 269, 72, 16788, 15302, 11138, 327, 1655, 3082, 20544, 337, 253, 1127, 253, 2929, 2789, 326, 253, 2954, 875, 1789, 4243, 310, 1774, 323, 4715, 253, 9162, 5203, 2789, 27350, 3282, 285, 3133, 281, 320, 1774, 275, 616, 4679, 50276, 19, 253, 
2929, 310, 12054, 2590, 347, 281, 253, 42852, 285, 4679, 1408, 407, 253, 4477, 50276, 20881, 1255, 265, 337, 253, 4477, 275, 619, 4743, 689, 5177, 366, 253, 15965, 8813, 9100, 767, 7223, 15571, 2139, 26278, 253, 7688, 875, 1789, 4243, 310, 1774, 891, 513, 417, 1158, 436, 11323, 1199, 281, 253, 2929, 285, 253, 2317, 651, 320, 1805, 5262, 4933, 247, 1029, 1268, 2590, 8813, 273, 253, 30328, 50276, 19, 891, 717, 417, 2119, 2139, 616, 7846, 849, 597, 13398, 1980, 4243, 310, 594, 1027, 432, 271, 4116, 3828, 24088, 275, 811, 16054, 352, 3133, 281, 320, 9591, 253, 1072, 4254, 594, 891, 513, 417, 2096, 849, 436, 310, 26401, 1027, 685, 253, 811, 16054, 10336, 310, 253, 2022, 3064, 253, 4156, 21496, 273, 253, 1789, 326, 253, 1980, 19492, 476, 320, 2429, 1411, 285, 970, 512, 1264, 46234, 50276, 783, 1182, 72, 1182, 77, 391, 50276, 9453, 12672, 13849, 285, 253, 2457, 7982, 50276, 20, 275, 2087, 253, 2929, 310, 2590, 616, 2022, 7680, 310, 970, 253, 4156, 1980, 285, 391, 1491, 275, 253, 7982, 4715, 285, 594, 253, 12288, 310, 326, 970, 512, 495, 4973, 273, 1491, 310, 253, 954, 4217, 2299, 275, 253, 4679, 627, 3133, 281, 320, 3710, 7756, 432, 841, 3607, 689, 253, 2613, 32049, 2829, 374, 594, 891, 4282, 849, 4217, 841, 1841, 1663, 403, 25761, 253, 11701, 403, 1355, 689, 2629, 3082, 285, 627, 310, 642, 2629, 11254, 281, 5513, 604, 841, 1543, 403, 1534, 891, 2007, 4282, 604, 253, 4477, 9257, 1160, 2119, 326, 616, 9978, 369, 2074, 281, 253, 6944, 9978, 285, 326, 253, 7756, 369, 2686, 1955, 281, 616, 1332, 390, 1805, 941, 42072, 390, 6944, 35615, 253, 4477, 2319, 7364, 2490, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 4460, 2746, 323, 4030, 72, 11273, 2460, 8981, 534, 29820, 253, 38524, 1491, 875, 253, 4156, 285, 1980, 6849, 273, 271, 1789, 50276, 262, 310, 247, 5272, 285, 1774, 4560, 326, 417, 760, 9999, 1980, 4243, 533, 12600, 731, 403, 4619, 281, 14631, 8936, 3045, 50276, 783, 4477, 17813, 616, 18595, 12510, 342, 1097, 10527, 22909, 285, 2762, 16774, 1543, 327, 2710, 49602, 50276, 783, 4477, 671, 858, 247, 1270, 2628, 275, 30080, 22559, 597, 2085, 625, 8254, 6787, 4465, 4679, 327, 1781, 15302, 285, 9841, 2908, 2228, 8965, 50274, 2252, 273, 253, 30628, 403, 10048, 342, 253, 30080, 85, 932, 285, 11985, 50276, 395, 512, 30628, 452, 247, 5185, 17401, 50276, 664, 1158, 436, 2929, 476, 3324, 747, 16039, 281, 253, 5304, 8981, 3114, 285, 1361, 952, 2096, 849, 253, 2234, 3386, 285, 616, 2493, 789, 50275, 32897, 671, 2486, 253, 9841, 2879, 4679, 285, 8254, 6787, 275, 253, 747, 18520, 50276 ]
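The row that ends here has the same shape as the previous one, which suggests the numeric columns were generated mechanically from the string columns. The sketch below is one plausible reconstruction of that preprocessing step; the dump does not include the actual script, so the separator handling, the truncation length, and the helper name build_row are all assumptions made for illustration.

```python
def build_row(input_text, output_text, tokenizer, max_length=2048):
    """Assumed construction of the numeric columns from the string columns.

    This mirrors a common causal-LM recipe (prompt + target in one sequence,
    labels == input_ids); treat it as a plausible reconstruction, not the
    documented pipeline.
    """
    # The Input strings in the dump already end with "### Summary:", so the
    # target summary is simply appended to form one training sequence.
    full_text = input_text + " " + output_text
    # The length cap is an arbitrary choice for this sketch.
    enc = tokenizer(full_text, truncation=True, max_length=max_length)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],
        # Causal-LM style: the labels column is just a copy of input_ids.
        "labels": list(enc["input_ids"]),
    }
```

The visible "### Summary:" marker at the end of each Input string is what makes this simple concatenation sufficient: the prompt, the review, and the target summary end up in one sequence, and labels is just a copy of input_ids.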
Below is given a review of a research paper from a conference journal. Please write a summary of the review. ### Review: in this paper the authors focus on the design and analysis of graph scattering networks with variable branching ratios and generic functional calculus filters spectrallyagnostic stability guarantees for node and graphlevel perturbations are established strengths this paper is wellorganized the theoretical results on stability guarantees and the experimental results on quantum chemical energies are interesting weaknesses 1 the authors present several upper bounds in section 4 for stability guarantees however the optimality of these bounds is unknown 2 the section on higher order scattering needs more background knowledge for readers to follow yes docsepthe paper proposes a generalization of graph scattering networks that goes beyond the graph wavelets setting the authors provide stability guarantees for their generalized scattering transform as well as layerwise energy decay bounds the authors propose a simple feature aggregation method to transform graphs into euclidean space and briefly discuss taking into account higherorder scattering the authors conducted experiments with their methods on a graph classification task and a graph regression task originality the idea methods etc are original the noveltyoriginality of the paper definitely meets the bar for publication at neurips quality the background and tools used are somewhat technical and motivated quite abstractly the quality of the mathematics derivations even though i didnt check proofs line by line is to the best of my knowledge sound clarity the organization and exposition of the article leaves much room for improvement details are given below significance nowadays it is certainly interesting and significant to consider problems involving gcns and graph networks the topic of this paper is of sufficient significance to meet the bar for publication at neurips overall i think the originality significance and mathematical soundness of the paper are fine whereas the clarity of the paper leaves room for improvement which i will detail in sections below main limitations are outlined in the section above no negative societal impact docsepin this paper the authors define a very general notion of scattering transform with an application to graphs they redefine each building block of the scattering transform spectral filters output lowpass functions nonlinearities etc as well as proper projection operators when the domain changes between layers under appropriate assumptions of lipschitzness of each of these elements they show stability of the resulting transform as well as energy preservation generalizing the classical euclidean results some variants on graphs are presented an aggregation strategy for graph classification and higherorder scattering experiments on real data show the effectiveness of the approach strengths a general approach that can take into account many variants domain changes etc all classical theoretical results on scattering hold a very complete supplementary material the experiments are convincing especially for graph regression weaknesses a bit paradoxically the approach suffers from too much generality the authors define very abstract operators and elements and in fact nothing in particular is about graphs at all until the discussions of sections 6 and 7 which are not the core of the approach furthermore the actual choice of the filters some combination of sin and cos is quite hidden within the experiment
section and may seem a tad arbitrary as a result the reader is somewhat left wondering throughout the paper what the actual architecture is whether this is just an abstract formulation of previous architectures or whether there is something fundamentally new here examples of implementation on graphs alongside the abstract description could really help the understanding of the approach many variants are described but they seem not to be tested in experiments changing graphs higherorder tensors the theorems are valid under many assumptions but a minimal example satisfying all of them is not given the authors are quite honest about the limitations in their first experiment where they are often not stateoftheart ### Summary:
in the discussion we reached a clear consensus that this paper is interesting for the neurips community and should be accepted the authors rebuttal and subsequent discussion were very useful and we are looking forward to the final version of the paper with the promised improvements implemented
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper presents an approach architectbuilder iterated guiding a method that tackles what is presented as an architectbuilder problem a scenario in which two actors an architect with knowledge of a highlevel goal or reward function must communicate over a discrete channel with a builder who can take actions in the environment based on the architects message in many ways this resembles a hierarchical reinforcement learning setup specifically reminiscent of feudal reinforcement learning feudal reinforcement learning dayan et al neurips 1992 the paper presents a motivated easytounderstand algorithm for training both the architect and the builder and evaluate on a series of construction based gridworld tasks resembling gridlu minigrid or mazebase namely the proposed approach takes inspiration from experimental semiotics and separates learning into separate interaction frames similar again to multiphase approaches in the hierarchical rl literature eg in hiro data efficient hierarchical reinforcement learning nachum et al 2018 httpsarxivorgabs170301161 these interaction frames consist of a modeling frame in which the architect learns a model of the builder after rolling out and sending messageswatching the builders actions followed by a guiding frame where the architect exploits its model of the builder to produce the optimal actions via a heuristic driven monte carlo tree search rather than explicitly estimate a value function which is costly and noisy making this all possible is that the architect has full knowledge of the highlevel reward in addition to the groundtruth state transition function this lets the architect explicitly search over messages to send to the builder as it fully observes the builders rollout then improve the builder by selfimitating over this new data mirroring a biphase optimization setup the evaluation focuses on the proposed model and two simple ablations one where the architect has nointent at training sending random messages and one in which the builder takes random actions the paper also presents in the main body and appendix a meaningful thoughtful intuitive explanation of the learning dynamics of the architecture and the builder more papers should dedicate portions of the main body to explanations such as this i believe that this is a wellwritten paper with solid motivation and thoughtful care in crafting the approach however the bulk of my criticism focuses on the assumptions underlying the approach and its situation relative to prior work in hierarchical and specifically feudal rl an entire body of work that is omitted in the paper concretely i believe that providing the architect access to the transition function is an incredibly strong assumption that undercuts this work existing work in hierarchical rl that learns separate high and lowlevel policies under similar assumptions as the architecturebuilder setup though possibly without the restriction of the discrete message channel do not make this perfect transition assumption and instead focus on alternative means of learning the architect some examples are hiro httpsarxivorgabs180508296 feudal networks httpsarxivorgabs170301161 hdqn httpsarxivorgabs160406057 amongst many many more there are many other feudal rl approaches including more modern ones that appear in the skill discovery literature this leads into my second key weakness which is around the evaluation i understand that full system 
evaluations are hard but in this specific case i think the current ablations trivially fail and dont provide much insight into the proposed approach instead i would love to see other work that looks at more traditional hrl approaches or that relaxes the some of the assumptions made in this work specifically relaxing the discrete communication channel assumption would allow for outofthebox comparisons to hiro and hdqn as well as more recent work other ablations id like to see in future versions of this paper would look at versions of the architect learning without access to transition rewards typosstylequestions broad positioning question i love the architectbuilder problem statement but its not immediately clear to me why we need to restrict the communication mechanism between the architect and the builderwhat that gives us more broadly if the goal is to see how we might emerge a language discrete tokens similar to work on emergent communication work itd be nice to see how well an architect or builder learned with this approach transfers to reallanguage settings eg adapt to natural language instructions if the goal is broadly coordination i definitely think falling back on the feudal rl literature is important looking at latent continuous representations of intent is pretty pervasive in hiro and newer work such as valor httpsarxivorgabs180710299 in general i believe this is a wellwritten wellmotivated paper that draws some great ideas from experimental semiotics furthermore the discussion and analysis of the learning dynamics of the various components of the approach should be a mainstay of future systems driven work in ml however i am deeply concerned with the assumptions made with this approach stemming from the architects access to the groundtruth state transition function this coupled with a missing discussion of related work in hierarchical rl and feudal rl specifically and a more thorough evaluation including comparison to these methods and other relevant ablations inform my decision to lean towards rejecting this paper docsepthe paper proposes an approach for interactive learning between a socalled architect and builder agent which is a different but related protocol to rl or imitation learning under the assumption that the architect knows the target dynamics and reward function the authors focus on learning the communication between the two parties such that the builder can solve the mdp after formalizing the setting in the multiagent paradigm where two individual mdps are defined the authors propose an algorithm with socalled modelling and guiding phases in each phase architect and builder gather datasets from which policies are extracted via behaviour cloning and planning the paper is evaluated on a proposed block environment for the problem and uses a random building agent and combination of random modelling phase with proposed guiding phase as baselines the results show that the proposed approach is superior wrt the baselines for solving the block environment in addition experiments show that the learned communication channel can potentially be reused for solving other tasks the modelling of the problem is sensible ie using two distinct connected mdps it would support clarity if this design decision was explainedbacked with respect to established research on multiagent modelling are any reductions possible it seems that several challenges of the proposed framework have been solved in different mas works so it would be userful to be clear here the goal of the paper is relevant 
for the conference as a communication channel is directly learned and the framework somewhat works towards the bigger goal of agi however it would be useful to add more real world applications of the approach what would intermediate applications be before a perfect version of the approach is available the related work is informative but i am missing comparison to recent works on such learning protocols eg 1 there are differences between the frameworks and 1 might work towards hri as also mentioned in the paper but the direction seems quite related it would be also interesting to dwell on the reusability of the proposed algorithm for the problem at hand the authors themselves note in the discussion of the paper that the employed methods are suboptimal for the defined learning problem ie using behavioral cloning and mcts it would be beneficial to clearly state the learning challenges when defining the protocol as it is a main part of the contribution i am also wondering to what extent it might be possible to reuse established approaches for emerging communication as already cited in the paper eg 2 the empirical evaluation is informative although the random building agent does not give too much additional insight it would be good to show relevant ablations in the main part of the paper a minor comment with respect to the figure of the approach i find the pseudocode of the algorithm quite helpful in my opinion more helpful but it is in the suppl material 1 nguyen k misra d schapire r dudk m and shafto p 2021 interactive learning from activity description arxiv preprint arxiv210207024 2 foerster jn assael ym de freitas n and whiteson s 2016 learning to communicate with deep multiagent reinforcement learning arxiv preprint arxiv160506676 the paper proposes an interesting and relevant framework for interactive learning teaching between two agents the model choice and solution is sensible and the empirical evaluation can shows that the initial approach for the problem works sufficiently well for the toy problem the learning challenges with respect to other works in mas could be clarified in more detail docsep after rebuttal i am keeping my current score i am positive about this framework as it presents a better model for multiagent communication especially enriching the communication among agents over the fixed restricted rewardbased communication protocol in traditional rl before rebuttal the paper proposes an architectbuilder problem setting where the architecture guides the builder to accomplish a goal by sending messages only the architecture knows the goal and has access to rewards whereas only the builder can act in the environment this setting is distinct from traditional reinforcement learning and imitation learning drawing inspiration from cognitive science theories the authors devise an algorithm for learning a communication protocol between the architect and the builder on gridworld tasks they show that learned the communication protocol can generalize to previously unseen tasks overall i find the setting and the proposed algorithm novel and interesting i really like that authors carefully distinguish their setting from rl and il and give very nice intuitions about the learning dynamics of the proposed algorithm although the architectbuilder relationship has been explored in previous work preventing the builder from knowing the rewards is to my best knowledge not explored however i feel like this setting makes more sense if the architect were a human as they may have implicit intent 
that the builder has to manage to interpret id encourage the authors to give a concrete example where this setting is useful in the agentagent setting in general what is the motivation of this work is it a computational model for studying humanhuman communication or is it offering a learning framework that aim for specific practical scenarios the motivation should be stated more clear in the introduction another suggestion is to rewrite the problem formulation using the mdp formalism ie s a t r gamma clearly defining the mdps and policies of the architect and the builder in addition the description of the interaction needs to be more precise right now it does not specify whether the architect sends a message to the builder after every step or the builder can take multiple steps after a message the main limitations of this work are 1 the architecture requires access to a simulator of the environment which may not be a problem for a human and 2 the simplicity of the experimented environment the authors adequately acknowledge these limitations suggested related work on more complex environments 1 collaborative dialogue in minecraft httpsaclanthologyorgp191537pdf 2 hierarchical decision making by generating and following natural language instructions httpsarxivorgpdf190600744pdf 3 interactive learning from activity description httpsarxivorgpdf210207024pdf 4 neural abstructions abstractions that support construction for grounded language learning httpsarxivorgpdf210709285pdf 5 incorporating pragmatic reasoning communication into emergent language httpsarxivorgpdf200604109pdf the paper presents a novel and interesting setting and algorithm they nicely implement ideas from cognitive science in an mdp setting even though the formulation needs to be more rigorous and the experiment environments are simplistic i think the contributions are interesting enough to draw attention of the community in the future i recommend acceptance docsepthe authors propose a setting in which two agents with asymmetric information collaborate to complete a goal they demonstrate that learning in this setting results in the emergence of communication protocols that generalize to new tasks strengths an interesting setting and learning paradigm which seems to promote the emergence of generalizable communication protocols weaknesses several assumptions seem to be made which limit the approachs direct applicability to more complex environments giving the architect access to the groundtruth environment model seems to be a very strong assumption the heuristic used in mcts is not described in detail but presumably it uses a significant amount of domain knowledge its unclear how the selfimitation learning works see questions below for more detail my main concern is that this only works due to the fact that the architect has access to ground truth environment transition models such that it can exploit spurious correlations between messages and behavior in an untrained builder model this seems unlikely to work at all even in this simple environment without access to the environment model lacking in experiments which attempt to understand the nature of the learned communication protocol from section b3 it seems like the agents simply learn to map messages directly to actions which is somewhat disappointing if this is the case then the generalization results may simply be down to the effectiveness of the mcts procedure in other words the architectbuilder pair learns messages that map to all possible actions so the architects job is 
reduced to standard mcts with the original action space questions how is it possible for the builder to have any sort of coherent predictable behavior at the beginning of training it seems this would be crucial for the selfimitation learning to work in other words an untrained builder agent will have no ability to interpret messages and modulate its behavior through them as a result the trajectories produced by the architectbuilder pair will be no better than random and selfimitation of these trajectories shouldnt produce any meaningful results it seems section 33 attempts to address this but a more intuitive description would be appreciated this paper presents an interesting setting however it appears to rely too heavily on strong assumptions and the communication protocols that emerge do not seem to exhibit any interesting properties ie the communicating agent simply learns to output messages that correspond to the desired action for the acting agent ### Summary:
this paper proposes a cognitive scienceinspired interaction setting between two agents an architect and builder in which the architect must produce messages to guide the builder to achieve a task unlike other related settings such as typical approaches in marl hrl or hri the builder does not have access to the architects reward function and must learn to interpret the architects messages by assuming the architect is telling it to do something sensible at the same time the architect determines what is sensible by building a model of the builders behavior and planning over it this setting is common particularly in humanagent interactions where humans may not be able to either 1 accurately communicate a scalar reward or 2 provide demonstrations but can still provide information that the agent ought to be able to learn from the paper demonstrates that the learned communication protocol generalizes well to new settings while this paper generated a lot of discussion the reviewers did not come to a consensus on whether the paper should be accepted or rejected with those in favor of the paper maintaining it should be accepted and those not in favor maintaining that it needs work i have therefore done a particularly close read of both the paper and the discussion in order to weigh the pros and cons brought up by the reviewers the positive reviews clearly indicate that this work is insightful and of interest to researchers in the iclr community in fact all reviewers mentioned they found the work interesting and wellwritten in particular reviewer hmet wrote i am positive about this framework as it presents a better model for multiagent communication especially enriching the communication among agents over the fixed restricted rewardbased communication protocol in traditional rl i am inclined to agree with this assessment and find the communication setting studied in this paper to be much more ecologically valid for humanagent interaction settings than having humans communicate scalar rewards or provide demonstrations humans are typically poor at the former and may not have the same embodiment to achieve the latter the negative reviews focused on a few cons 1 the assumption that the architect has access to a groundtruth environment model 2 confusion about differences from other related fields eg feudal rl marl and 3 lack of analysis of the communication protocol i have considered these points but do not feel any of them are fatal flaws 1 from the perspective of humanagent interaction i think it is very reasonable to assume that a human architect would have a good model of the world and would be generally proficient at solving tasks in the world making this approach work in the setting where the architect is also learning how the world works seems squarely in the domain of future work 2 the authors have done an extensive job of clarifying the differences between these related areas and as discussed above other reviewers found the way in which agp is different to be insightful and ecologically valid 3 this is potentially the most serious con as the discussion with reviewer bhgy brought up the learned communication protocol may just be a simple mapping between messages and environment interactions after further discussion in which the authors argued that learning a simple mapping is not a problemthe main question is how to even induce such a mapping in the first placethe reviewer acknowledged that this is not a fatal flaw but that makes the results somewhat less interesting in summary the positive 
reviews highlighted the interestingness and insightful nature of the questions studied in this paper and have convinced me that this paper will be of interest to the iclr community as it has provides a new perspective on the problem of agentagent interaction particularly for the special case of humanagent interaction the negative reviews did highlight a few limitations of the paper but i expect these can be addressed by future work and do not feel they outweigh the interestingness of the problem in light of this i recommend acceptance as a poster suggestion for the authors i found the discussion with reviewer bhgy to be particularly insightful and helpful in understanding the aims of the paper i would encourage you to incorporate some of this into the cameraready version of the paper and perhaps to lean more heavily on the special case of humanagent interaction as motivation of this work as also hinted at by reviewer hmet
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper incorporate the popular contrastive with unsupervised learning from video specifically multiple frames from the same video is used as positive pairs and frames from different videos is viewed as negative pair the author also proposed a simple and effective ways to collect classbalanced and diverse video frame dataset from youtube the author conducted extensive evaluation experiments on both video recognition and image recognition downstream tasks extensive ablation experiments demonstrated the effectiveness of utilizing multiple frames and the balanced data collection algorithms my concerns 1 lack of novelty the idea of mapping different frames in one video closer predicting other frames in one videos or maximizing mutual informations of embedding of different frames in one video is widely explored and adopting contrastive learning in video unsupervised learning scenario has been done before mentioned in the related work section of your paper too 2 no comparison against other video based unsupervised learning algorithms from my viewpoint improve over single framed based contrastive learning only proves that your algorithm successfully utilized temporal information encoded in the data and provide limited insights for exploiting more useful information from videos 3 if you can demonstrate your way of incorporating contrastive learning into video based unsupervised learning offers a nontrivial improvement or have a significant difference with other video based unsupervised learning the impact of your work will be larger overall i think this paper is interesting but its contribution is limited docsepthe idea of learning representations from video rather than single images is an appealing one with many favorable properties to allow a system to get direct signal on appearance of objects under various natural transformations occlusion lighting etc combining instance discrimination ideas of loss based on unlabelled images for which it is known whether they are similar or not with the idea of curating the images from video is hypothesized to yield learned representations that capture properties enabling improved performance across a variety of single image tasks the authors create a dataset based on video with positive pairs for noise contrastive estimation conduct fairly comprehensive experiments and promise to make their newly constructed dataset available the experiments showcase this type of learned representation outperform alternatives not based on videos on a variety of tasks quality this seems like a solid paper offering a good intuitive idea with well supported experimental section to show case its relevance clarity the paper is quite clearly written for the most part the dataset section 31 seems to have an omitted paragraph please see point 1 below section 32 is quite lean and does not stand on its own but relies heavily on previous work omitting much of the essence i would recommend spending a bit more time on ensuring it is more rigorously written see for example my comment 2 below originality the paper is modestly original it combines two existing ideas that of using discrimination loss for unsupervised learning of image features and that of using video based data to allow for rich example of the same data that takes into account real world type transformations thus it is hard to claim more than moderate originality however significance the improvements over 
existing baselines are solid though i would not categorize them as dramatic given the originality is also solid the overall significance is moderate comments 1 the dataset generation section is strange did you omit too much we use the following fast and automated procedure to generate the images in our dataset using this procedure it almost seems like a few sentences were dropped between the first and second sentences while the information exists in the appendix a sentence or two seem to be warranted in the main test 2 gradients flow through the positive pairs at this point in the text you have only introduced a loss the sentence warrants the question of gradients with respect to what for rigor and clarity of exposition this intuition related statement should come after you talk of what is the parameterized aspect of eq 1 wrt which gradients are taken so after eq 2 is introduced and the idea that the feature representations are captured through learned method and in particular some reference to the nn you are using this makes it a bit more complete as a description of the method and presented in a more methodical order 3 why would you not remove the option of choosing an anchor and positive as the same image seems easy to avoid 4 how do you know if the video doesnt contain a shift to a different scene with different content thus making the positive actually very different 5 since the representation you provide is supposed to learn representations that are somehow more informed about natural transformations occlusion lighting is there an experiment you can conceive of to test this specific hypothesis i think it would both be interesting and also give insight on whether this is indeed what is being learned this might also make this representation useful for other types of tasks that are not looked at in the paper that i would encourage the authors to explore docsepthis paper proposes an idea to use nce for videos where positivenegative training pairs are created by temporally sampling different frames in the video the idea is fairly straightforward there is not anything particularly novel about the approach other than how the training pairs are created i think it would have been more interesting and insightful if more ways to generate these pairs were proposed and evaluated for example video has many modalities eg rgb audio optical flow etc in addition to the spatial and temporal dimensions exploring ways of creating pairs using multimodal data would have been more interesting experimentally there arent really any comparisons to previous selfsupervised learning methods this is a pretty major weakness as it makes it difficult to understand how well this task is doing methods like simclr provide 70 accuracy on imagenet and others do well on video tasks see missing related works below currently im not convinced by the experiments the studies comparing different pretraining data and multiframe vs single frame are interesting and show the potential of the approach there are some missing related works for example cooperative learning of audio and video models from selfsupervised synchronization neurips18 audiovisual scene analysis with selfsupervised multisensory features eccv18 evolving losses for unsupervised video representation learning cvpr20 these works all provide strong performance on unsupervised video representation learning yet are not mentioned or compared to overall i think the proposed idea is not especially novel and the experiments arent strong enough to show that the simple idea 
is good i think there is some potential in the paper but needs more to be convincing ### Summary:
this paper was a difficult decision overall it seems to be a quality paper well written and with many experiments in particular evaluating learned representations across various tasks and datasets the authors were also quite courteous in their replies which is appreciated i really like the point the paper makes about video as a natural augmentation and i find that novel amid the recent nce surge where most papers rely critically on augmentation r4 was also very positive about the paper overall concept in terms of paper weaknesses two of the reviewers voted for rejection because the paper ignores existing work on contrastive learning from videos the authors rebuttal is that they are the first evaluating on images not on videos all reviewers also point out limited technical novelty which the authors acknowledge finally r1 is not very confident about the experiments overall and after calibration the appropriate recommendation seems to be rejection
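The reviews in this row describe the paper's core mechanism only at a high level: frames sampled from the same video form positive pairs, frames from other videos in the batch serve as negatives, and the encoder is trained with a noise-contrastive (InfoNCE-style) objective. A minimal sketch of such a frame-pair objective, assuming PyTorch; the names `encoder`, `anchor_frames`, and `positive_frames` are illustrative assumptions, not taken from the reviewed paper:

```python
import torch
import torch.nn.functional as F

def frame_infonce_loss(encoder, anchor_frames, positive_frames, temperature=0.1):
    """InfoNCE-style loss over video frames: anchor_frames[i] and
    positive_frames[i] come from the same video (positive pair);
    all other frames in the batch act as negatives."""
    z_a = F.normalize(encoder(anchor_frames), dim=1)    # (B, D) anchor embeddings
    z_p = F.normalize(encoder(positive_frames), dim=1)  # (B, D) positive embeddings

    # Cosine similarities between every anchor and every positive in the batch.
    logits = z_a @ z_p.t() / temperature                 # (B, B)

    # For anchor i, the matching positive sits at column i.
    targets = torch.arange(z_a.size(0), device=z_a.device)
    return F.cross_entropy(logits, targets)
```

In this formulation each row of `logits` is a classification over the batch, so the batch itself supplies the negatives, which matches the reviewers' description of contrasting frames across videos.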
input_ids: [ 30003, 310, 1677, 2278, 273, 247, … ]  (token-ID encoding of this row's text; remaining values omitted)
attention_mask: [ 1, 1, 1, … ]  (all ones; remaining values omitted)
labels: [ 30003, 310, 1677, 2278, 273, 247, … ]  (token-ID targets mirroring input_ids; remaining values omitted)
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: the paper proposes a novel generalisation measure ie measurement that indicates how well the network generalises based on pruning the idea is to measure the fraction of the weights that can be pruned either randomly or based on the norms without hurting the training loss of the model the paper provides thorough discussion of the related methods and motivates the measure in multiple ways further the authors show empirical evidence for the correlation of the pruning robustness to the generalisation ability of networks based on the paper by jiang et al 2019 and dataset updated with additional models provided in the paper the paper is well written and states a clear goal of introducing and proving a generalisation measure based on pruning the authors provide a nice discussion and lots of empirical evidence nonetheless several points of motivation for the measure seem unclear to me the authors refer to jiang et al 2019 saying that there are lots of generalisation measures that correlate with performance but that they fail to explain the test performance which calls for another measure if the experiments shown in jiang et al are demonstrating failure of explaining test performance then the presented paper also does not provide many more evidences that the proposed measure is not failing the motivation for pruning as a way to improve generalisation is connected to the training procedures dropout and lottery ticket hypothesis nevertheless none of these methods improve generalisation when applied on top of already trained network i would not say that lottery ticket retraining can be classified as integration of pruning into optimisation as well as dropout improves training only when all the network is used afterwards there are several failures that make me believe that more work can improve the paper the goal of the paper is to show a measure that will perfectly predict generalisation but according to the experiments it can be outperformed by other measures on the presented dataset the theoretical justification seem unclear to me what is the goal of introducing the generalisation bound moreover even appendix does not have details of derivation of the presented formula if the authors notify themselves right away that it is vacuous the justification is to give an intuition of how pruning connects to generalisation it is unclear to me though how it can be concluded based on a vacuous bound the idea to check the measures behaviour in double descent setup is very interesting but only one measure is checked there and in a different experimental setup without proper motivation for such change section543 attempts to analyse casual connection between existing measures that seems to me unclear by motivation as well one wants to see causal connection to generalization not other measures especially that the table3 is discussed only for the random perturbations measurement still not providing an answer what type of the connection is there between two i would suggest to reject the paper since the idea feels not being worked through enough minor comments 1 it would be nice to have a discussion on the type of pruning used does it somehow change the measurements in a predictable way 2 typo in the first sentence of section3 twice denote 3 typo in the first sentence of the second paragraph of section3 letter denoting data distribution 4 table1 misses highlights of the winning approaches i would like to 
thank authors for accurate answers and a lot of work put on reworking the paper unfortunately i still find my concerns about motivation for the metric valid which together with the rather weak performance creates a problem for this paper i highly encourage authors to continue the work and try to explain the reasons for this correlation and find justifications for usage of the metric docsepthe authors consider the problem of estimating generalization in deep neural networks and propose a measure based on the ability to set a fraction of the neural network weights to zero prunability the authors introduce some theoretical motivation based on the pacbayesian framework and perform an empirical evaluation based on a set of convolutional networks trained on the cifar10 dataset the problem of understanding generalization in deep neural networks is a fundamental problem of deep learning and the results obtained by the authors present a potentially interesting perspective on the study of generalization in deep neural networks the paper is wellwritten and presents some interesting empirical results however the theoretical contribution is not particularly novel and is not complete enough to justify all the measures evaluated in paper due to this issue i feel that the paper only has limited impact i detail my comments on the theoretical and empirical results of the paper below on the theoretical side the contribution for the case of random pruning is minor and is a straightforward extension of known results indeed random pruning is similar to many other wellstudied random perturbation schemes and corresponds exactly to the case of dropout a wellunderstood method on the other hand there does not appear to be any bound provided for the case of magnitude pruning and it is not immediately obvious why such a measure should lead to a generalization bound indeed existing work on using pruning and compression for measuring generalization arora et al 2018 zhou et al 2019 establish bounds on the pruned network and not the original network establishing a bound in terms of the magnitude pruning measure would be an interesting contribution as handling such nonrandom modifications has been a challenge in the community however the authors do not seem to make any attempt at providing such a bound or a heuristic argument for such a bound to hold finally the choice of measuring the prunability of a network as a proportion requires more careful justification in the context of magnitude pruning indeed for random pruning the proportion is easily interpreted as a magnitude of noise injected however in the magnitude pruning setup with the parallels the authors draw to occams razor and compression ideas it is the absolute number of parameters which is more theoretically relevant than the proportion of parameters which can be eliminated having 1 million parameters where half can be eliminated is still more complex than only having 100000 parameters where none can be eliminated on the empirical side i found that the methodology was clear and welladapted for the random pruning method which is closely related to dropout and other random perturbation ideas with the addition of networks of different depth in the set of networks used for evaluation the empirical methodology also seems appropriate for the magnitude pruning method and the parallels the authors draw to compression however it is not obvious to me that the empirical results support that connection as we know that in architectures which vary substantially in size eg 
mobilenet or efficientnet the compressibility ie the proportion of parameters which can be pruned is directly related to the size of the network see eg gupta et al 2017 and i feel that this is a further indication that the choice of using the proportion of pruned parameters should be more carefully discussed the presentation of the empirical result could be improved by including either in the main text or the appendix the standard errors for the measured correlation for all measures and ensuring that the tables are formatted similarly to the iclr template for better readability the latex package booktabs can be used for this purpose other notes please ensure that you cite published versions of papers when they are available instead of the arxiv preprints for example arora et al 2018 appeared at icml 2018 and zhou et al 2018 sic appeared at iclr 2019 there are many other such cases in the references edited after author response i thank the authors for their considerate responses overall my opinion remains mostly unchanged and i share similar opinions to reviewer 3 and 4 that although the proposed idea is interesting and intriguing the paper is not quite ready at this point i would like to see the authors present either 1 stronger empirical evidence for the importance of their metric or 2 a more solid theoretical foundation of the measure they propose docsep paper summary in order to understand why deep networks generalize well this paper proposes prunability as an empirical measure that can be predictive of the generalization prunability is roughly the smallest fraction ie in 01 of parameters that can be retained while zeroing out everything else without increasing the models training loss by too much the authors experimentally demonstrate the predictive ability of this measure in three ways 1 they consider the largescale empirical framework of jiang et al 2019 where one computes different statistics about generalization vs the measure from a large pool of trained models viz demogen dataset from jiang et al 2018 they compare prunability against four other existing measures frobenius norms random perturbation robustness a flatnessbased measure normalized margins a layerwise margin measure the average loss across training a speedofoptimization based measure then they compute three different previouslyproposed statistics a kendalls rank correlation coefficient jiang et al 2019 which tells us how well the measure can rank the models here prunability performs better than the flatness measure and much better than the norm and speed measures however normalized margins outperform everything b adjusted r2 here prunability performs just as well as other measures although normalized margins outperform again c a conditional mutual information cmi term jiang et al 2019 that tells us whether the measure has a causal role in the generalization behavior prunabilitys cmi is pretty low revealing poor causal connections 2 they conduct maddox et al 2020s experiment where one evaluates the measure for varying widths to see whether it can capture the double descent behavior of the test loss they observe that prunability does show a double descent behavior while maddox et al 2020s flatnessbased effective dimensionality only shows an ascentdescent behavior 3 finally they demonstrate that prunability captures something different from all the other measures they do this by showing that there is little causal connection between prunability and all other measures except the flatnessbased randomperturbationrobustness 
measure then they go on to show that pruning and random perturbations affect models differently notably pruning can lower the test loss of the network while random perturbations always only increases the test loss strengths 1 the idea that compressibility relates to generalization is not new and has been theoretically quantified via generalization bounds however such bounds are still parameter count dependent andor theyre computed for a handful of models and its not clear how well they correlate with generalization the paper takes an orthogonal direction towards relating compressibility and generalization it pins down an empirical measure of compressibility and then provides three sufficiently different kinds of arguments to demonstrate the usefulness of that metric although the experiments used within these arguments in themselves are not novel i think the fact the metric holds up in all these tests is interesting 2 the way this paper quantifies prunability in terms of the fraction of parameters is simple and also somewhat thoughtprovoking why should generalization be related to the fraction of parameters 3 the paper is honest and rigorous in terms of the values it reports prunability is not the best of all metrics and the paper is transparent about it the paper also gives sufficient credit to work that it builds upon the writing is smooth weaknesses 4 given that this is a purely empirical paper id have appreciated if the observations made in the doubledescent experiment and the effect of pruningvsperturbation on test loss experiment were also shown in at least one other datasetarchitecture overall opinion this paper provides multiple different pieces of evidence backing its claim that prunability predicts generalization these empirical observations are rigorous and would be valuable in understanding the generalization puzzle this way of quantifying prunability might also open up new theoretical questions hence i think this is a good paper worth publishing clarification questions 5 could you explain why the last few experiments are reported in terms of the crossentropy loss is it because the corresponding plots for the 01 error do not show as much different between the lines nevertheless i feel that itd be nice to have those plots in the paper too no pressure to produce those plots during rebuttal 6 could you clarify the claims in the first paragraph of 541 specifically prunability is highly informative of generalization across all of our evaluation metrics outperforming random perturbation robustness the training loss itself and the frobenius norm measures seems to contradict later observations that on some evaluation metrics they are all just as good as each other on adjusted r2 minor suggestions under table 1 and 2 would be nice to remind the reader as to whether larger values indicate better predictivity or not is the usage of adjusted r2 inspired by jiang et al 2018 if so would help to cite them appropriately page 12 typo vargins margins suggested citation on the importance of single directions for generalization ari s morcos david gt barrett neil c rabinowitz matthew botvinick iclr 2018 httpsarxivorgabs180306959 they empirically study how generalization is related to how many hidden units you can zero out certainly not the same as what the submission suggests but i think worth citing references yiding jiang behnam neyshabur hossein mobahi dilip krishnan and samy bengio fantastic generalization measures and where to find them arxiv preprint arxiv191202178 2019 yiding jiang dilip 
krishnan hossein mobahi and samy bengio predicting the generalization gap in deep networks with margin distributions arxiv preprint arxiv181000113 2018 wesley j maddox gregory benton and andrew gordon wilson rethinking parameter counting in deep models effective dimensionality revisited 2020 update the authors clarified all my questions very well they also added an extra plot for the double descent experiment on a different architecture resnet although i feel a bit lukewarm about the added plot in that the double descent phenomenon is only somewhat weakly reflected by their empirical measure ive increased my confidence score from 3 to 4 to appreciate their efforts in addressing my concerns good luck to the authorsdocsepin the present work the authors tackle the highly debated and sometimes confusing problem of finding a good simplicitycomplexity measure able to predict generalization performance of deep networks a novel measure called prunability is introduced and compared with some of the many alternatives in the literature this property measures how networks are able to retain low training loss when a fraction of the weights is set to zero and is clearly related to common training practices eg dropout that seems to yield better generalization performance in practice the experimental settings and the evaluation methods for this new metric are inspired by recent extensive studies on deep networks performance the authors are able to show that prunability is in fact associated with good generalization and seems able to capture some nontrivial phenomena doubledescent but they also find it to be inferior to preexisting margin based measures moreover the close relationship to perturbation robustness and flatness measures is investigated but the results are not fully conclusive in my opinion the idea behind this complexity measure makes a lot of sense but 1 it is already used explicitly in dropout so i dont see how it could inspire the development of better training heuristics 2 it is not easier to measure that most other complexity measures in the random version 3 it is closely related to preexisting metrics 4 more importantly it is not more predictive of generalization than some alternatives already discussed in the literature therefore i think that this works lacks at least one strong point that could motivate the inclusion of this latest complexity measure into the generalization debate ### Summary:
summary the authors propose to predict a neural network classifiers generalization performance by measuring the proportion of parameters that can be pruned to produce an equivalent network in terms of training error experimental and theoretical evaluation are provided discussion the overall opinion in reviews was that the idea is potentially interesting but needs to be pursued further before publication and that the empirical evaluation in particular was lacking that was followed by a detailed discussion in which authors were able to address a number of concerns and have provided helpful additional experiments recommendation this is a potentially interesting paper that is not quite there yet although reviewers have raised scores in discussion the case for acceptance would still be hard to make i recommend to reject it looks like a reasonable amount of additional work will turn this from what is now on the weak end of borderline into a potentially strong submission especially given the thoughtful and thorough feedback from reviewers the next toptier conference deadline is not far away and i encourage the authors to incorporate the feedback fully and resubmit soon that being said i agree with reviewers that the theory provided is at present not strong also a point that still seems to require work is the relation between prunability and the use of dropout note to authors and chairs anonreviewer3 explicitly stated in discussion that they would raise their score from 5 to 6 but the change was not recorded in the system my recommendation assumes their score is 6
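The reviews and meta-review above hinge on the proposed prunability measure: the largest fraction of weights that can be zeroed (for instance by global magnitude pruning) while the training loss stays close to that of the unpruned network. A rough sketch of how such a quantity could be computed, assuming PyTorch; `eval_train_loss` and the absolute `tolerance` are illustrative assumptions rather than the paper's actual protocol:

```python
import copy
import torch

@torch.no_grad()
def magnitude_prunability(model, eval_train_loss, tolerance=0.01,
                          fractions=(0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9)):
    """Largest fraction of weights that can be zeroed by global magnitude
    pruning while the training loss stays within `tolerance` of the
    unpruned loss. `eval_train_loss(model)` is a hypothetical helper
    assumed to return the average training loss."""
    base_loss = eval_train_loss(model)
    all_weights = torch.cat([p.abs().flatten() for p in model.parameters()])

    best = 0.0
    for frac in fractions:
        pruned = copy.deepcopy(model)
        # Global threshold: the value below which a `frac` fraction of
        # absolute weight magnitudes falls (k-th smallest).
        k = max(1, int(frac * all_weights.numel()))
        threshold = all_weights.kthvalue(k).values
        for p in pruned.parameters():
            p.mul_((p.abs() > threshold).float())
        if eval_train_loss(pruned) <= base_loss + tolerance:
            best = frac
    return best
```

Random-pruning prunability, which the reviewers relate to dropout, would replace the magnitude threshold with a Bernoulli mask at each candidate fraction.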
input_ids: [ 292, 390, 5919, 3024, 253, … ]  (token-ID encoding of this row's text; remaining values omitted)
[ 1, 1, 1, ..., 1 ]  (attention_mask: all entries are 1, one per token of input_ids)
[ 292, 390, 5919, 3024, 253, 19477, 2322, 26332, 253, 8394, 273, 3602, 534, 476, 320, 819, 37437, 310, 3587, 2905, 281, 253, 1979, 273, 253, 2990, 923, 24088, 1149, 37668, 1162, 355, 4240, 285, 891, 1928, 326, 436, 310, 247, 2007, 14011, 326, 253, 4327, 273, 970, 253, 8394, 273, 819, 37437, 3602, 943, 320, 625, 9257, 5469, 50276, 783, 9759, 273, 253, 16774, 906, 812, 320, 5520, 407, 1690, 2057, 275, 253, 2022, 2505, 390, 253, 30762, 253, 2629, 6332, 323, 253, 4080, 5921, 323, 512, 5593, 285, 17749, 326, 253, 7180, 403, 39113, 12014, 281, 253, 17857, 32888, 7646, 323, 1805, 1239, 1430, 253, 44127, 5522, 1984, 33754, 476, 320, 908, 323, 436, 4096, 50276, 977, 7211, 4496, 5416, 326, 368, 26542, 3863, 9508, 273, 9380, 672, 597, 403, 2130, 3185, 273, 253, 549, 32693, 638, 21937, 323, 1650, 549, 6464, 1162, 355, 4765, 5420, 387, 17857, 1686, 4765, 285, 1182, 14451, 1162, 355, 4765, 256, 280, 5420, 387, 17857, 32888, 6247, 627, 403, 1142, 643, 824, 2219, 275, 253, 10414, 50274, 49539, 846, 2488, 2380, 891, 5717, 253, 4477, 323, 616, 1908, 366, 6128, 4583, 619, 4743, 4558, 6571, 19965, 285, 891, 3894, 2074, 11626, 281, 37317, 495, 285, 577, 326, 3738, 253, 4081, 2934, 310, 4722, 285, 27807, 253, 2929, 310, 417, 3240, 4704, 387, 436, 1127, 891, 651, 751, 281, 923, 253, 4477, 1246, 2057, 337, 10046, 16774, 1941, 323, 253, 6349, 273, 616, 7982, 390, 374, 247, 625, 4891, 10527, 12153, 273, 253, 2557, 597, 12661, 5474, 33032, 2929, 6010, 50276, 249, 1340, 281, 2096, 2139, 3676, 6928, 39970, 973, 436, 2929, 29328, 819, 328, 1430, 347, 271, 16774, 2557, 326, 476, 320, 15970, 273, 253, 26647, 819, 328, 1430, 310, 11467, 253, 8004, 6919, 26332, 275, 14805, 273, 3602, 326, 476, 320, 14667, 1223, 5058, 272, 562, 3253, 2010, 1293, 3629, 253, 3210, 3733, 2957, 407, 1512, 1199, 253, 4477, 21657, 7568, 253, 15970, 3745, 273, 436, 2557, 275, 1264, 4088, 50276, 18, 597, 1908, 253, 1236, 2510, 25912, 16774, 7792, 273, 480, 22589, 1162, 355, 6247, 835, 581, 48169, 1027, 9990, 670, 26647, 4632, 253, 2557, 432, 247, 1781, 6363, 273, 10166, 3210, 40027, 1471, 2646, 10895, 432, 480, 22589, 1162, 355, 4765, 597, 7277, 819, 328, 1430, 1411, 1740, 643, 5368, 5593, 50276, 35255, 7564, 3750, 22429, 3632, 20452, 31640, 247, 6507, 1255, 3169, 2557, 12650, 24390, 247, 3828, 3020, 8459, 2557, 253, 3388, 2957, 2439, 3733, 247, 3885, 1171, 2178, 27996, 1754, 2557, 50274, 7461, 597, 11897, 1264, 1027, 3786, 856, 7334, 9990, 50273, 66, 465, 423, 10037, 5958, 5921, 10235, 480, 22589, 1162, 355, 6247, 534, 8599, 441, 849, 973, 253, 2557, 476, 5958, 253, 3210, 1060, 819, 328, 1430, 17923, 1805, 685, 253, 6507, 1255, 2557, 285, 1199, 1805, 685, 253, 5222, 285, 3885, 5593, 2299, 12650, 24390, 562, 32231, 3253, 50273, 67, 10904, 391, 19, 1060, 819, 328, 1430, 17923, 816, 347, 973, 347, 643, 5593, 3738, 12650, 24390, 562, 32231, 969, 50273, 68, 247, 17697, 15577, 1491, 260, 7373, 1307, 480, 22589, 1162, 355, 6247, 326, 8599, 441, 1880, 253, 2557, 556, 247, 19349, 2554, 275, 253, 26647, 3879, 819, 328, 1430, 84, 260, 7373, 310, 3965, 1698, 19678, 4105, 19349, 10291, 50273, 19, 597, 2589, 278, 1911, 1004, 1162, 355, 9169, 84, 3368, 835, 581, 44995, 253, 2557, 323, 11962, 34414, 281, 923, 1880, 352, 476, 9232, 253, 4021, 18499, 3879, 273, 253, 1071, 2957, 597, 10018, 326, 819, 328, 1430, 1057, 921, 247, 4021, 18499, 3879, 1223, 278, 1911, 1004, 1162, 355, 9169, 84, 6507, 1255, 3169, 3576, 7877, 1319, 760, 2722, 271, 49104, 3229, 1154, 3879, 50275, 20, 4720, 597, 7568, 326, 819, 328, 1430, 28174, 1633, 1027, 432, 512, 253, 643, 5593, 597, 513, 
436, 407, 4645, 326, 627, 310, 1652, 19349, 4602, 875, 819, 328, 1430, 285, 512, 643, 5593, 3707, 253, 6507, 1255, 3169, 3632, 44931, 318, 18848, 461, 1255, 2557, 840, 597, 564, 327, 281, 921, 326, 819, 25004, 285, 3632, 26309, 2818, 3210, 13359, 19836, 819, 25004, 476, 2406, 253, 1071, 2957, 273, 253, 2990, 1223, 3632, 26309, 1900, 760, 5459, 253, 1071, 2957, 50274, 296, 3755, 20556, 50276, 18, 253, 2934, 326, 19477, 2322, 7033, 281, 26647, 310, 417, 747, 285, 556, 644, 28055, 18755, 3066, 26647, 14493, 2299, 824, 14493, 403, 1335, 4764, 1385, 7976, 285, 263, 597, 250, 10302, 323, 247, 17167, 273, 3210, 285, 697, 417, 2590, 849, 973, 597, 24888, 342, 26647, 50276, 783, 2929, 3936, 271, 19627, 3884, 4404, 12600, 19477, 2322, 285, 26647, 352, 23854, 1066, 271, 16774, 2557, 273, 19477, 2322, 285, 840, 50276, 11404, 1487, 1264, 10481, 1027, 9351, 273, 7125, 281, 7568, 253, 31471, 273, 326, 7982, 3738, 253, 4679, 908, 1561, 841, 7125, 275, 3746, 403, 417, 4460, 891, 1158, 253, 958, 253, 7982, 6556, 598, 275, 512, 841, 5216, 310, 4722, 50276, 19, 253, 1039, 436, 2929, 2677, 7790, 819, 328, 1430, 50276, 249, 2426, 273, 253, 6919, 273, 3602, 50276, 261, 2969, 285, 671, 8489, 1869, 11404, 6856, 2139, 943, 26647, 320, 2905, 281, 253, 6919, 273, 3602, 50275, 20, 253, 2929, 310, 8274, 285, 26565, 275, 2426, 273, 253, 2193, 352, 5012, 819, 328, 1430, 310, 417, 253, 1682, 273, 512, 17082, 285, 253, 2929, 310, 13955, 670, 352, 253, 2929, 671, 4245, 4209, 6152, 281, 789, 326, 352, 21168, 2220, 253, 4028, 310, 6032, 50275, 20881, 1255, 265, 50276, 21, 1677, 326, 436, 310, 247, 15846, 16774, 2929, 2654, 452, 14109, 604, 253, 7313, 1160, 275, 253, 25128, 40513, 3368, 285, 253, 1055, 273, 819, 25004, 10936, 44931, 318, 327, 1071, 2957, 3368, 497, 671, 2011, 275, 387, 1878, 581, 643, 10895, 1116, 38413, 50274, 1189, 455, 4743, 50276, 2520, 2929, 3400, 2709, 1027, 7437, 273, 1941, 19673, 697, 1750, 326, 819, 328, 1430, 26295, 26647, 841, 16774, 7313, 403, 26565, 285, 651, 320, 9865, 275, 4685, 253, 26647, 25351, 436, 1039, 273, 2677, 5411, 819, 328, 1430, 1537, 671, 1527, 598, 747, 10527, 3533, 7613, 891, 1158, 436, 310, 247, 1175, 2929, 4409, 18051, 50275, 498, 274, 1877, 3533, 50276, 22, 812, 368, 5513, 2139, 253, 1390, 1643, 4679, 403, 2361, 275, 2426, 273, 253, 2831, 290, 10144, 2957, 310, 352, 984, 253, 3969, 14777, 323, 253, 14805, 2228, 513, 417, 921, 347, 1199, 1027, 875, 253, 3104, 17837, 891, 1928, 326, 352, 69, 320, 5322, 281, 452, 1110, 14777, 275, 253, 2929, 1512, 642, 3473, 281, 4711, 1110, 14777, 1309, 30080, 22559, 50275, 23, 812, 368, 19148, 253, 3916, 275, 253, 806, 12494, 273, 39522, 5742, 50276, 1087, 328, 1430, 310, 4122, 27096, 273, 26647, 2439, 512, 273, 776, 7103, 17082, 50276, 483, 468, 14692, 3632, 20452, 31640, 253, 3733, 2957, 3139, 285, 253, 8954, 7564, 3750, 5222, 5593, 50275, 339, 3030, 281, 17343, 1996, 7313, 326, 327, 690, 7103, 17082, 597, 403, 512, 816, 347, 1175, 347, 1016, 643, 327, 10904, 391, 19, 50275, 37585, 13991, 50275, 4524, 2829, 337, 285, 374, 651, 320, 5322, 281, 9287, 253, 9414, 347, 281, 1880, 4067, 2193, 5224, 1805, 3283, 2351, 390, 417, 50276, 261, 253, 10393, 273, 10904, 391, 19, 11797, 407, 480, 22589, 1162, 355, 4765, 604, 594, 651, 1361, 281, 26542, 731, 20420, 50276, 6377, 1249, 1745, 80, 945, 35022, 50276, 78, 1662, 968, 50275, 35640, 264, 25577, 50276, 251, 253, 6349, 273, 2014, 10746, 323, 26647, 247, 363, 256, 2298, 4752, 34843, 301, 305, 85, 2534, 12436, 425, 300, 260, 391, 25967, 29384, 1111, 783, 88, 17994, 8498, 781, 17857, 32888, 4765, 5987, 
39962, 2061, 5375, 11395, 1229, 2090, 3046, 50274, 9328, 45190, 1263, 849, 26647, 310, 2905, 281, 849, 1142, 8763, 5085, 368, 476, 5058, 562, 5604, 417, 253, 1072, 347, 752, 253, 19529, 5936, 533, 891, 1158, 4409, 19936, 50275, 250, 3065, 50275, 90, 2821, 480, 22589, 1602, 6292, 425, 656, 8621, 321, 288, 375, 34103, 9119, 42128, 7425, 532, 36407, 763, 11943, 285, 1775, 90, 270, 1205, 900, 15143, 26647, 5593, 285, 835, 281, 1089, 731, 549, 32693, 638, 3845, 549, 32693, 746, 805, 2640, 20070, 6247, 50275, 90, 2821, 480, 22589, 7425, 532, 36407, 763, 11943, 288, 375, 34103, 9119, 42128, 285, 1775, 90, 270, 1205, 900, 21565, 253, 26647, 8037, 275, 3676, 6928, 342, 8459, 10670, 549, 32693, 638, 3845, 549, 32693, 1093, 2313, 520, 1012, 4765, 50275, 88, 265, 2205, 480, 278, 1911, 1004, 305, 1747, 590, 16535, 251, 285, 285, 2663, 305, 14276, 31380, 1665, 294, 37341, 4764, 15496, 275, 3676, 3210, 3576, 7877, 1319, 27694, 959, 9169, 50275, 11183, 253, 4477, 31637, 512, 619, 3533, 1077, 973, 597, 671, 2879, 271, 4465, 7484, 323, 253, 4021, 18499, 3368, 327, 247, 1027, 10336, 501, 3024, 3738, 891, 1928, 247, 2372, 298, 17936, 44041, 670, 253, 2879, 7484, 275, 326, 253, 4021, 18499, 11562, 310, 760, 8489, 22112, 11392, 407, 616, 16774, 2557, 209, 422, 2559, 619, 7162, 4868, 432, 495, 281, 577, 281, 11435, 616, 6031, 275, 15974, 619, 7350, 1175, 7516, 281, 253, 4477, 7152, 339, 9852, 253, 1246, 789, 253, 4477, 18915, 253, 4122, 37730, 285, 4536, 21643, 1895, 273, 4560, 247, 1175, 17647, 19017, 414, 2557, 2104, 281, 3283, 26647, 3045, 273, 3676, 6928, 247, 4460, 2557, 1925, 819, 328, 1430, 310, 5611, 285, 2429, 342, 690, 273, 253, 1142, 18075, 275, 253, 6239, 436, 2867, 5593, 849, 6928, 403, 2104, 281, 13280, 1698, 3733, 2957, 672, 247, 6919, 273, 253, 13461, 310, 873, 281, 5058, 285, 310, 4518, 2905, 281, 1846, 3733, 8333, 24088, 5926, 483, 326, 3133, 281, 4917, 1805, 26647, 3045, 275, 3946, 253, 5661, 7533, 285, 253, 7103, 3082, 323, 436, 747, 7982, 403, 11797, 407, 3332, 9470, 2175, 327, 3676, 6928, 3045, 253, 4477, 403, 2104, 281, 921, 326, 819, 328, 1430, 310, 275, 958, 2330, 342, 1175, 26647, 285, 3133, 2104, 281, 9232, 690, 37825, 16958, 25128, 40513, 533, 597, 671, 1089, 352, 281, 320, 18134, 281, 638, 20137, 8459, 1754, 5593, 25761, 253, 2810, 2954, 281, 20452, 31640, 285, 6507, 1255, 5593, 310, 6949, 533, 253, 1543, 403, 417, 4751, 38662, 275, 619, 4743, 253, 2934, 3212, 436, 10454, 2557, 2789, 247, 2257, 273, 3282, 533, 337, 352, 310, 2168, 908, 11120, 275, 5926, 483, 594, 891, 13414, 923, 849, 352, 812, 26761, 253, 2440, 273, 1805, 3733, 344, 321, 3397, 374, 352, 310, 417, 6927, 281, 2557, 326, 954, 643, 10454, 5593, 275, 253, 3632, 2715, 495, 352, 310, 8244, 2905, 281, 638, 20137, 17082, 577, 625, 15538, 352, 310, 417, 625, 15970, 273, 26647, 685, 690, 18075, 2168, 5469, 275, 253, 6239, 3103, 891, 1158, 326, 436, 2987, 19756, 387, 1878, 581, 2266, 1127, 326, 812, 41509, 253, 11250, 273, 436, 6323, 10454, 2557, 715, 253, 26647, 8881, 187, 187, 4118, 18435, 27, 8774, 50276, 783, 4477, 12661, 281, 3283, 247, 11454, 2990, 49996, 26647, 3045, 407, 10499, 253, 8394, 273, 3602, 326, 476, 320, 819, 37437, 281, 4711, 271, 6425, 2990, 275, 2426, 273, 3733, 2228, 5661, 285, 10527, 7103, 403, 2530, 50275, 49794, 253, 4583, 4743, 275, 10123, 369, 326, 253, 2934, 310, 7826, 4722, 533, 3198, 281, 320, 23321, 2007, 1078, 9311, 285, 326, 253, 16774, 7103, 275, 1798, 369, 14999, 326, 369, 3560, 407, 247, 7000, 5955, 275, 534, 4477, 497, 2104, 281, 2953, 247, 1180, 273, 7350, 285, 452, 2530, 9371, 3081, 4679, 
50276, 250, 27167, 318, 436, 310, 247, 7826, 4722, 2929, 326, 310, 417, 3240, 627, 2568, 3738, 30628, 452, 5439, 7363, 275, 5955, 253, 1083, 323, 14924, 651, 1335, 320, 1892, 281, 1056, 891, 5583, 281, 12009, 50276, 262, 4453, 751, 247, 5272, 50276, 19581, 273, 3081, 789, 588, 1614, 436, 432, 752, 310, 1024, 327, 253, 5075, 990, 273, 45210, 715, 247, 7826, 2266, 19529, 3340, 1677, 253, 30457, 285, 11080, 8680, 432, 30628, 253, 1735, 281, 431, 1321, 8059, 20639, 310, 417, 2080, 1977, 285, 891, 11907, 253, 4477, 281, 19071, 253, 8680, 4751, 285, 501, 538, 2225, 3517, 326, 1146, 753, 891, 5194, 342, 30628, 326, 253, 3762, 2530, 310, 387, 1246, 417, 2266, 671, 247, 1127, 326, 1335, 3133, 281, 2430, 789, 310, 253, 5886, 875, 819, 328, 1430, 285, 253, 897, 273, 5926, 483, 50276, 9939, 281, 4477, 285, 21583, 271, 251, 15337, 254, 20, 11120, 4767, 275, 5955, 326, 597, 651, 7164, 616, 4868, 432, 608, 281, 721, 533, 253, 1818, 369, 417, 5950, 275, 253, 985, 619, 17401, 19584, 616, 4868, 310, 721 ]
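The three list columns in each row (input_ids, attention_mask, labels) are simply the tokenized form of that row's Input and Output text. As a rough, non-authoritative sketch of how such rows are typically produced, the snippet below assumes a GPT-NeoX-style Hugging Face tokenizer, a 2048-token cap, and plain causal-LM label copying; none of these choices (tokenizer checkpoint, separator, truncation length) is confirmed by the dump itself, so treat them as placeholders.

```python
# Illustrative sketch only: one plausible way a row's token columns could be
# derived from its text fields. The tokenizer checkpoint is an assumption.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")  # assumed, not stated in the dump

def encode_row(input_text: str, output_text: str, max_len: int = 2048) -> dict:
    # The Input field already ends with the "### Summary:" cue, so the target
    # summary (Output) is appended directly after it; the exact separator used
    # by the original preprocessing is not visible here.
    full_text = input_text + " " + output_text
    enc = tok(full_text, truncation=True, max_length=max_len)
    return {
        "input_ids": enc["input_ids"],
        "attention_mask": enc["attention_mask"],  # all 1s when no padding is added
        "labels": list(enc["input_ids"]),         # labels mirror input_ids for causal-LM loss
    }
```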
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper focuses on the problem of adding entailment relations in an eventuality activities states events kg the main contribution is a pipelined unsupervised method for this problem the method is divided into three stages 1 decomposing eventualities predicates mapped to wordnet and arguments mapped to probase 2 local inference step aggregate entailment scores on predicates and arguments and 3 global inference step use transitivity of entailments human evaluation demonstrates quality of the inferred entailment relations overall the paper is well written the choice of aggregating scores over aligned arguments logical or equation 1 is not well motivated if a single argument matches the score for the set becomes 1 irrespective of other arguments is this expected isnt logical and more suitable if not how does a logical or produce high quality entailment relations a discussion on such choices would be really useful in results analysis 52 the lower performance for sv compared to svopo is attributed to unary relations between predicates and arguments being ambiguous however the difference in performance is there only for global inference step and not for local inference step it would be interesting to know why there is more drop in case of sv it would be interesting to know how the method compares on smaller graphs if the method cant handle smaller graphs it would be informative to highlight why and what sizes does it expect in the first paragraph of introduction it will be useful to define eventuality when it is introduced with the help of an example eg the one in fig 1 fig 2 caption generated generate docsepthis paper proposes a threestep framework for acquiring eventuality entailment knowledge compared with existing approaches the proposed framework can handle much larger scale eventualities in an unsupervised manner however it would be better to include a case study and compare it with previous work such as aser comments the domain of eventuality is commonsense can this paper adopt conceptnet instead of probase of wordnet it is better to include a case study and compare it with previous work such as aser do hyperparameters eg threshold affect the result this paper improves the prior work but analysis of the proposed method and comparisons are missingdocsepthis paper proposes a method for automatically constructing large entailment graphs the method seems reasonable but there are various issues regarding clarity novelty and empirical evaluation i think this paper is below the bar of aclemnlp but focuses on a topic that is central to a conference like akbc and thus is a good fit the method proposed has 3 parts a defining the nodes of the graph which correspond to events unlike prior work these nodes do not have any variables in them and can be both unary binary and ternary authors claim that this is an advantage since the semantic meaning is more precise but there is a big price also using the rules is harder when they are more specific unfortunatley there is no real evaluation that tests whether this is indeed an issue b defining a local similarity score between pairs of nodes this is based on combining a similarity score between arguments and a similarity score between predicates the scores are relatively straightforward one thing i found weird was that the score for a set of arguments was defined using or rather than and that is it seems enough if you have two sets of aligned 
terms that only one of them has high similarity to give a high score to the entire set this seems counterintuitive and is not explained c defining a procedure for going through entailment paths and adding edges between events this is done in a fairly precisionoriented manner as taking the transitive closure can be too noisy overall during the experimental evaluation it is unclear how many edges one gains by adding the global step since in many cases the numbers given are coarse and it seems like very few new rules were added overall the paper proposes a method for constructing graphs and reports some accuracy of generated rules and it seems to be doing ok i am unsure if this is enough for akbc for a first tier conference there are other things that must be done 1 some discussion on the usefulness of representation empirical experiment the authors change the representation such that it is more specific but this makes it applicability low it is hard to say whether 100m rules is a lot or not it could very well be that even though 100m sounds like a lot trying to use these rules in an application will fail because the rules are too specific as it is it is hard to judge the recall of the system and it is likely that there are many rules it does not capture and it is hard to say how many rules will actually be used in a scenario where one would want to use them 2 clarity various places in the paper were unclear 321 the first paragraph is not clear 33 why do the rules create forest are there no cyces 3 experimental evaluation evaluation is posthoc only that is you sample rules from the models and estimate precision but what about recall what do the rules cover the authors say this is a large rulebase but it is hard to judge similarly it seems a better eval would be to show the rules are useful for some downstream application nowadays people are buildling less knowledgebases of rules since it is an arduous task and moving to encoding knowledge through learning and retrieval onthefly i am not convinced that buildling these kbs would be useful in a nlu task but maybe for testing probes of various sorts also it seems like in 410 cases running the global inference did not add rules or added very little it is hard to know when the number of rules is reported in millinos only why is that there is no empirical comparison to prior work as far as i can see to conclude a method for building entailment graphs is presented resulting in tens of millions of rules that are of high precision however the paper does little to convince this is a useful representation and rulebase and empirical evaluation is weak ### Summary:
this paper proposes a novel framework for acquiring eventuality entailment knowledge to construct a knowledge graph the multistep construction process is well explained and has clear justification however the paper could be stronger if it did more to convince the audience that such a knowledge graph is a useful representation with promising downstream applications the work would also benefit from additional empirical evaluation
[ 30003, 310, 1677, 2278, 273, 247, 2561, 2929, 432, 260, 2369, 1793, 6698, 15, 7764, 3630, 247, 6010, 253, 2278, 15, 187, 4118, 8439, 27, 187, 2520, 2929, 16633, 327, 253, 1895, 273, 6240, 46518, 420, 2493, 275, 271, 2362, 10982, 4712, 3054, 3394, 15841, 253, 2022, 7680, 310, 247, 9196, 293, 967, 440, 35421, 1332, 323, 436, 1895, 253, 1332, 310, 4272, 715, 1264, 8661, 337, 11101, 28163, 27585, 1005, 50275, 12787, 31290, 18301, 281, 3159, 3024, 50276, 395, 7125, 18301, 281, 1742, 511, 374, 1980, 17032, 3213, 50276, 46601, 366, 46518, 420, 7363, 50276, 251, 2063, 31290, 285, 7125, 285, 495, 4156, 17032, 3213, 50276, 2327, 811, 5714, 273, 46518, 942, 1966, 7103, 14371, 3290, 273, 253, 22245, 46518, 420, 2493, 4583, 253, 2929, 310, 973, 3542, 50276, 783, 4327, 273, 9406, 839, 7363, 689, 15616, 7125, 13760, 390, 5150, 337, 310, 417, 973, 17194, 604, 247, 2014, 4154, 10129, 253, 4868, 323, 253, 873, 4916, 337, 30472, 273, 643, 7125, 310, 436, 3264, 310, 2649, 13760, 285, 625, 7470, 604, 417, 849, 1057, 247, 13760, 390, 4711, 1029, 3290, 46518, 420, 2493, 247, 5955, 327, 824, 10165, 651, 320, 1663, 4217, 50276, 249, 1543, 1783, 8073, 253, 2406, 3045, 323, 18504, 2429, 281, 18504, 38332, 50276, 261, 12877, 281, 440, 552, 2493, 875, 2063, 31290, 285, 7125, 1146, 23851, 2299, 253, 3064, 275, 3045, 50276, 261, 627, 760, 323, 4156, 17032, 3213, 285, 417, 323, 1980, 17032, 3213, 352, 651, 320, 4722, 281, 871, 2139, 627, 310, 625, 5926, 275, 1083, 273, 18504, 50276, 262, 651, 320, 4722, 281, 871, 849, 253, 1332, 26662, 327, 4577, 14580, 604, 253, 1332, 16216, 6016, 4577, 14580, 352, 651, 320, 27096, 281, 6780, 2139, 285, 752, 9552, 1057, 352, 1902, 50275, 249, 253, 806, 12494, 273, 10199, 352, 588, 320, 4217, 281, 4853, 2362, 10982, 672, 352, 310, 5611, 342, 253, 1361, 273, 271, 1650, 24088, 253, 581, 275, 3036, 337, 50276, 926, 374, 11743, 4561, 50276, 16450, 5474, 33032, 2520, 2929, 29328, 247, 289, 250, 383, 554, 7792, 323, 28635, 2362, 10982, 46518, 420, 3640, 2429, 342, 5368, 7274, 253, 4081, 7792, 476, 6016, 1199, 4067, 4311, 27585, 1005, 275, 271, 440, 35421, 5133, 2299, 352, 651, 320, 1805, 281, 2486, 247, 1083, 1263, 285, 7277, 352, 342, 2045, 789, 824, 347, 347, 254, 50276, 26122, 50276, 783, 5028, 273, 2362, 10982, 310, 764, 49235, 476, 436, 2929, 5283, 4473, 3024, 3185, 273, 1742, 511, 273, 3159, 3024, 50276, 262, 310, 1805, 281, 2486, 247, 1083, 1263, 285, 7277, 352, 342, 2045, 789, 824, 347, 347, 254, 50276, 3088, 4373, 22041, 24088, 7887, 2818, 253, 906, 50276, 2520, 2929, 19132, 253, 2720, 789, 533, 1783, 273, 253, 4081, 1332, 285, 14023, 403, 5816, 7152, 33032, 2520, 2929, 29328, 247, 1332, 323, 8356, 26736, 1781, 46518, 420, 14580, 253, 1332, 3133, 5272, 533, 627, 403, 2710, 3374, 5001, 19843, 38135, 285, 16774, 7103, 891, 1158, 436, 2929, 310, 2708, 253, 2534, 273, 913, 5616, 13307, 81, 533, 16633, 327, 247, 9400, 326, 310, 4275, 281, 247, 8059, 751, 29507, 12847, 285, 3021, 310, 247, 1175, 4944, 50276, 783, 1332, 4081, 556, 495, 4243, 247, 13947, 253, 7632, 273, 253, 4216, 534, 2723, 281, 3394, 12401, 2720, 789, 841, 7632, 513, 417, 452, 667, 4903, 275, 731, 285, 476, 320, 1097, 440, 552, 8985, 285, 49688, 552, 4477, 1750, 326, 436, 310, 271, 5750, 1580, 253, 24705, 4495, 310, 625, 10799, 533, 627, 310, 247, 1943, 4376, 671, 50276, 5302, 253, 4803, 310, 12150, 672, 597, 403, 625, 2173, 5369, 2665, 255, 2205, 627, 310, 642, 1524, 7103, 326, 5216, 1880, 436, 310, 6296, 271, 2523, 270, 13947, 247, 1980, 14259, 4868, 875, 8557, 273, 7632, 436, 310, 1754, 327, 16248, 247, 14259, 4868, 
875, 7125, 285, 247, 14259, 4868, 875, 2063, 31290, 253, 7363, 403, 4942, 15246, 50276, 531, 2181, 891, 1119, 12504, 369, 326, 253, 4868, 323, 247, 873, 273, 7125, 369, 2931, 970, 390, 2581, 685, 285, 326, 310, 352, 3133, 2217, 604, 368, 452, 767, 5239, 273, 15616, 2426, 326, 760, 581, 273, 731, 556, 1029, 14259, 281, 1918, 247, 1029, 4868, 281, 253, 2862, 873, 436, 3133, 4828, 565, 48714, 285, 310, 417, 5544, 50276, 68, 13947, 247, 5199, 323, 1469, 949, 46518, 420, 11865, 285, 6240, 9297, 875, 3394, 436, 310, 2218, 275, 247, 9648, 12320, 21085, 5133, 347, 3192, 253, 811, 1483, 14230, 476, 320, 1512, 27620, 4583, 1309, 253, 5661, 7103, 352, 310, 12744, 849, 1142, 9297, 581, 15988, 407, 6240, 253, 4156, 3213, 1580, 275, 1142, 2219, 253, 3904, 1677, 403, 25319, 285, 352, 3133, 751, 1077, 1643, 747, 4803, 497, 2879, 50276, 1189, 455, 253, 2929, 29328, 247, 1332, 323, 26736, 14580, 285, 5012, 690, 7200, 273, 4561, 4803, 285, 352, 3133, 281, 320, 2509, 8718, 891, 717, 31488, 604, 436, 310, 2217, 323, 29507, 12847, 323, 247, 806, 30625, 8059, 627, 403, 643, 1841, 326, 1364, 320, 2218, 50276, 18, 690, 5955, 327, 253, 31471, 273, 6779, 50276, 358, 5378, 474, 3368, 253, 4477, 1818, 253, 6779, 824, 326, 352, 310, 625, 2173, 533, 436, 2789, 352, 30437, 1698, 352, 310, 1892, 281, 1333, 1880, 2233, 78, 4803, 310, 247, 2257, 390, 417, 352, 812, 1077, 973, 320, 326, 1014, 2167, 2233, 78, 7835, 751, 247, 2257, 2820, 281, 897, 841, 4803, 275, 271, 2898, 588, 1891, 984, 253, 4803, 403, 1512, 2173, 347, 352, 310, 352, 310, 1892, 281, 5963, 253, 6983, 273, 253, 985, 285, 352, 310, 2779, 326, 627, 403, 1142, 4803, 352, 1057, 417, 9232, 285, 352, 310, 1892, 281, 1333, 849, 1142, 4803, 588, 2686, 320, 908, 275, 247, 10076, 835, 581, 651, 971, 281, 897, 731, 50276, 19, 19843, 50276, 2044, 784, 5053, 275, 253, 2929, 497, 12744, 33251, 253, 806, 12494, 310, 417, 2590, 5922, 2139, 513, 253, 4803, 2794, 9741, 403, 627, 642, 2645, 707, 50276, 20, 5661, 7103, 7103, 310, 1501, 37806, 760, 326, 310, 368, 3410, 4803, 432, 253, 3210, 285, 6642, 12320, 533, 752, 670, 6983, 752, 513, 253, 4803, 3835, 253, 4477, 1333, 436, 310, 247, 1781, 4086, 4793, 533, 352, 310, 1892, 281, 5963, 50275, 3549, 6241, 352, 3133, 247, 1805, 2777, 651, 320, 281, 921, 253, 4803, 403, 4217, 323, 690, 15450, 2898, 31735, 952, 403, 1973, 1981, 1679, 3640, 67, 1169, 273, 4803, 1580, 352, 310, 271, 549, 563, 528, 4836, 285, 4886, 281, 9706, 3640, 949, 4715, 285, 25064, 327, 783, 16247, 891, 717, 417, 13762, 326, 1973, 1981, 841, 465, 1768, 651, 320, 4217, 275, 247, 295, 7675, 4836, 533, 5046, 323, 5175, 19432, 273, 2710, 16308, 50276, 12563, 50276, 262, 3133, 751, 275, 31938, 2219, 3515, 253, 4156, 17032, 858, 417, 823, 4803, 390, 2879, 1077, 1652, 352, 310, 1892, 281, 871, 672, 253, 1180, 273, 4803, 310, 2361, 275, 5499, 15777, 760, 2139, 310, 326, 50276, 9088, 310, 642, 16774, 5301, 281, 2720, 789, 347, 2080, 347, 891, 476, 923, 50276, 936, 7525, 247, 1332, 323, 3652, 46518, 420, 14580, 310, 3559, 4795, 275, 7114, 273, 9790, 273, 4803, 326, 403, 273, 1029, 12320, 2299, 253, 2929, 1057, 1652, 281, 18578, 436, 310, 247, 4217, 6779, 285, 4086, 4793, 285, 16774, 7103, 310, 5075, 187, 187, 4118, 18435, 27, 2520, 2929, 29328, 247, 4460, 7792, 323, 28635, 2362, 10982, 46518, 420, 3640, 281, 3989, 247, 3640, 4216, 253, 1554, 382, 554, 5140, 1232, 310, 973, 5544, 285, 556, 2590, 22861, 2299, 253, 2929, 812, 320, 10046, 604, 352, 35205, 625, 327, 21414, 8446, 326, 824, 3640, 4216, 310, 247, 4217, 6779, 556, 12532, 15450, 4893, 253, 789, 476, 671, 5649, 432, 
6240, 625, 16774, 7103 ]
[ 1, 1, 1, ..., 1 ]  (attention_mask: all entries are 1, one per token of input_ids)
[ 30003, 310, 1677, 2278, ..., 625, 16774, 7103 ]  (labels: identical to the input_ids list above)
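Because every Input ends with the literal "### Summary:" cue, the obvious downstream use of these pairs is prompt-completion summarization. The sketch below shows how a causal LM fine-tuned on such rows might be queried; the checkpoint name is hypothetical and the decoding settings are generic defaults, not values taken from this document.

```python
# Usage sketch; "your-review-summarizer" is a hypothetical fine-tuned
# checkpoint, not a model referenced anywhere in this dump.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")          # assumed tokenizer
model = AutoModelForCausalLM.from_pretrained("your-review-summarizer")  # hypothetical

def summarize(review_prompt: str, max_new_tokens: int = 256) -> str:
    # review_prompt is one Input field verbatim, ending with "### Summary:".
    ids = tok(review_prompt, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model.generate(ids, max_new_tokens=max_new_tokens, do_sample=False)
    # Keep only the continuation generated after the prompt tokens.
    return tok.decode(out[0, ids.shape[1]:], skip_special_tokens=True)
```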
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: summary the paper presents a method to decompose scenes into its constituent objects this is done with a generative framework that generates both bounding boxes and segmentation masks for each object it relies on several previously existing technologies its main contribution is enforcing consistency between bounding boxes and segmentation masks pros outperforms the baselines cons the paper can be hard to read contributions seem minor good results but on two toy datasets only comments the writing should be improved as the paper can be hard to follow one one hand this includes broken sentences inspired by the observation that foreground segmentation masks and bounding boxes both contain object geometric information and should be consistent with each other grammatical errors it proves that there are still many useful information can be discovered in those unlabeled data and sentences which are just hard to parse in the former type of models the scene is encoded into the objectoriented disentangled spatial and appearance encoding explicitly on the other the authors cite many concepts without introducing them in the paper stick breaking spatial broadcast decoder multiotsu thresholding etc which makes it non selfcontained the paper presents what seem like engineering improvements over previous works eg combining bounding boxes and segmentation masks by adding more components to the framework which is quite convoluted see fig 1 resnet fpn rpn segmentation vaes etc it is hard to know where performance comes from despite the ablation tests the experiments are limited to two toy datasets with a fixed number of simple objects which must be known beforehand which show no background interference and little occlusion in all i do not think it meets the iclr bar i am not an expert on the topic so i may have missed relevant datasetsbaselines detail region of interest introduced after roi has been mentioned several timesdocsepthis paper presents a variation of the monet model where an additional region proposal network generates bounding boxes for various objects in the scene an additional loss is introduced during training to make the segmentations produced by the monet segmenter consistent with the proposed bounding boxes results are demonstrated on multi dsprites and clevr with modest performance gains the paper is somewhat middle of the road in most aspects the proposed method is in my opinion only a slight variation on the existing monet model though presented clearly i dont feel that adding that loss makes the model better in any fundamental way even if performance numbers are slightly better in some circumstances furthermore though there is some ablation analysis i feel the level of analysis of the results is subpar when a relatively simple variation of a model like here is proposed i would want to see an effort to analyse the contribution beyond how it affects the numbers do we learn anything new by introducing the variation does it tell us some fallacy or failure of the original model and if so does it fix it these are lacking here a few more concrete points in table 1 why is the resnet18 fpn missing from dsprites dataset this ablation is probably the single most important experiment present in the paper i would want to see it reported on both datasets in general i feel the performance on these datasets is quite saturated and i hope to see results on more challenging data in the future 
the proposed method included there is very little discussion about the choice of hyperparameters in the paper how were they chosen is the system sensitive to these choices post rebuttal comments thank you authors for the detailed response i think some of of my concerns have been answered the paper may be a valid contribution to the community and i am raising my scoredocsepin this paper the authors introduce a regionbased approach for unsupervised scene decomposition it extends the previous monet by introducing the regionbased selfsupervised training instead of purely generating foreground masks in an rnn they simultaneously predict the bounding boxes and segmentation masks using a fasterrcnnbased framework the supervision comes from the object reconstruction loss and the selfsupervised loss of classification and regression of anchors in the rpn module the experiments and comparisons are only conducted on the synthetic clevr and multidsprites dataset paper strength the paper is well motivated and the proposed approach seems to be reasonable the selfsupervised idea is interesting that uses the segmentation mask to get the pseudo bounding box label for object detection which could ensure the consistency of object mask and bounding box paper weakness 1 the selfsupervision between segmentation masks and detection bounding boxes is the main contribution while incorporating the selfsupervision into monet is meaningful and interesting the overall novelty does not look significant 2 clarification of methods how to learn mk in a selfsupervise way is unclear monet uses spatial attention to identify the most salient object one by one which makes senses but here you segment all the objects in one step how could this be possible in an unsupervised way from the example in fig 2 it looks rmonet could pick out some small and faraway objects first which is not intuitive in fasterrcnn the positivenegative samples are selected by calculating the iou threshold between the sampled bbox and the ground truth bbox however in this selfsupervised approach there is no ground truth bbox although the authors proposed to use the pseudo bbox from the segmentation mask mk how could this be reliable since mk is likely of poor quality especially at the initial stage the selected k value is unclear in the original monet the spatial attention network is an rnnlike structure they decompose the scene stepbystep therefore they define the k steps however in this fasterrcnnbased framework the objects are selected in one step how to select the k1 objects in all proposals 3 results most of the results are with toy images there is no result on real images there is no result to really demonstrate the effectiveness of the selfsupervised loss the author should compare their rmonetunet with the baseline of rmonetunet wo the selfsupervised loss ie removing the object detection branch another missing baseline is monetunet in table 1 the monet resnet18fpn is 10 percent lower than the original monet does this means the network structure has a greater influence on the performance than the selfsupervision component in table 1 the rmonetlite performs worse once again i guess this poor performance comes from the network structure as the input image is 6464 the resnet downsamples the image to a very low resolution which losses the spatial information the visual results cannot demonstrate the advantage of the proposed approach for example in figure 3 the visual performance of monet and rmonetunet are quite similar update in general i am happy with 
the authors responses they did show the advantage of the introduced selfsupervised loss although the selfsupervised loss is intuitive incorporating it into monet is nontrivial and it does outperform monet despite the fact that such selfsupervised methods are hard to make work on real scenes this paper does have some merits i am willing to increase my rating ### Summary:
the paper has good contributions to a challenging problem leveraging a fasterrcnn framework with a novel selfsupervised learning loss however reviewer 4 and other chairs in calibration considered that the paper does not meet the bar for acceptance the other reviewers did not champion the paper either hence i am proposing rejection pros r1 and r3 agree that the proposed model improves over related models such as monet the value of the proposed selfsupervised loss connecting bounding boxes and segmentations is well validated in experiments cons r4 gives good suggestions that may be useful to reach a broader readership namely introducing more of the concepts used in the paper eg stick breaking spatial broadcast decoder multiotsu thresholding so it becomes more selfcontained r4 also suggests improving the writing more generally r4 still finds the proposed method quite complex yet derivative after the rebuttal all reviewers complain about lack of experiments in real data but the authors did revise their paper and add some coco results in the appendix these could be part of the main paper in a future version
input_ids: [long token-id list omitted — the tokenized form of the corresponding review-and-summary text]
attention_mask: [list of the same length, all 1s, omitted]
labels: [long token-id list omitted — appears identical to input_ids]
Below is given a review of a research paper from a conference journal. Please write a summary of the review.
### Review: The authors propose an approach for soft-labeling test data for test-time adaptation (TTA) purposes based on conjugate pseudo-labels. Motivated by the result that meta-learning the TTA objective given test-time labels roughly recovers the temperature-scaled training objective, the authors show that if the training objective can be expressed in the conjugate form L(x, y) = f(h(x)) - y^T h(x) (x: inputs, y: labels), then the optimal test-time loss can be approximated by a conjugate loss L^conj built from f*(∇f(h(x))), with f* the conjugate function of f, which is independent of the label and implies the conjugate pseudo-labels (the usual softmax of h(x) for the CE loss). TTA results on CIFAR-10-C, CIFAR-100-C and ImageNet-C demonstrate that conjugate pseudo-labels consistently outperform other pseudo-labeling approaches.

Originality: 7, quality: 7, clarity: 7, significance: 7. The paper is well written and the approach and results are interesting. Using conjugate PLs for PolyLoss is novel and delivers on-par or better results than existing methods. The conjugate PL approach is adequately developed in the paper but could benefit significantly from a section discussing current limitations (the more general case where L^conj is not independent of y, the dual view) and future research directions; see the SW section.

docsep
Test-time training is an emerging and promising approach for building machine learning models that are robust to distribution shifts. As the model cannot access ground-truth labels, most of the prior test-time adaptation literature designs an unsupervised loss function (e.g., the entropy minimization used in TENT). However, as claimed by this paper, it is unclear what makes a good unsupervised loss in the TTA setting. This paper contributes to this important problem in three ways. 1) They show that the meta-learned unsupervised objective, which can be regarded as a best unsupervised loss, recovers the well-known entropy minimization when the base classifier is trained with cross-entropy, but recovers a different unsupervised loss function when a different supervised loss function is used; the results suggest that the best loss function differs with the choice of the supervised loss. 2) They analyze the phenomenon through the lens of conjugate convex functions and show that it recovers the existing test-time adaptation strategies, which is similar to the objective found by meta-learning. 3) Based on this observation, they propose conjugate pseudo-labels as a method for test-time adaptation; this can be regarded as a general case of pseudo-labels and can be used for loss functions whose pseudo-labeling strategy is not trivial (e.g., the recently proposed PolyLoss). They empirically show that the conjugate pseudo-labels match the performance of entropy minimization when the source model is trained with cross-entropy loss, and outperform existing methods when the source model is trained with PolyLoss or squared loss.

As mentioned in the summary, this paper provides several interesting insights about how to design good unsupervised loss functions through a series of experiments and theoretical analysis. The paper is well motivated, and each finding is well described and interesting. Not only does the paper explain why the existing method works well, it also provides a more generic form of the loss function for a broader class of supervised losses, called the conjugate pseudo-label. The empirical validation supports the merits of the proposed method, especially when the source model is trained via an unusual loss function such as PolyLoss or squared loss.

While I found the paper interesting, there are several unclear points in the current manuscript.

Major comments. 1) The details of the meta-learning experiments are not sufficiently presented: for example, which parameters of the networks were updated, and what optimizer, learning rate, and batch size were used? It is not clear whether these factors affect the meta-learned objective or not, and there is no clear explanation of what the prediction score in Figure 1 is. Regarding the meta-learning experiments, I am also concerned about whether the unsupervised loss function is learned in an online adaptation setup or an offline adaptation setup. In the standard meta-learning literature, in my understanding, the inner loop and outer loop are repeated many times, i.e., an offline adaptation setup is assumed. While this approach can reveal a good objective function, it might not be effective in an actual test-time online adaptation setup, since there is a discrepancy between meta-learning and the actual test phase; for example, in online adaptation, adaptation speed and adaptation stability might be an issue given that there are fewer clues about the data distribution. 2) While I like the general story of the paper and I am positive about accepting it, I am still not convinced that the conjugate convex perspective can fully explain what makes a good unsupervised loss function in the test-time adaptation setup. For example, why does the meta-learning objective recover the temperature-scaled loss function rather than the standard entropy suggested by equation 9? I wonder whether this is related to the assumption that the source models are sufficiently overparameterized, or to the nature of the test-time adaptation setup, which requires online (and perhaps fast) adaptation, unlike other unsupervised adaptation setups. 3) Related to the above question, the results cannot explain why several prior methods are often better than TENT in practice, even when the source model is trained via cross-entropy loss. For example, [1] shows that SHOT-IM, which adds a diversity regularization term, often obtains better results than TENT; they also show that a feature alignment approach often gives better results than the simple feature modulation approach. While I understand that this is out of the scope of this paper (i.e., the authors do not claim that conjugate pseudo-labeling is the best possible solution), I am curious whether the conjugate optimization perspective could give further insights. 4) While it depends on the reader, I found that Section 4 should be more self-contained; for example, the logic behind some important equations (e.g., Eqs. 2, 3, 5) should be sufficiently described.

Minor comments. In Algorithm 1 there is no description of n, and the index n seems to be used with different meanings: around Equation 6, n represents the batch size, while in Algorithm 1, n means the index of a sample or batch. In Table 1, hard PL without temperature should be bold. See questions.

docsep
The paper probes and evaluates the use of the meta-learnt convex conjugate of the original training loss for test-time adaptation to distribution shifts. It shows an intuition into the development of the method via a visual analysis of the numeric form of the learnt test-time loss against an analytical approximation of known losses, discovering the similarity with the so-called source loss available during training. The pseudo-labels formulated via the conjugate seem to provide a general method of computing a pseudo-label that works for losses besides the venerable CE. The reformulation of the problem within Legendre-Fenchel duality may indeed be the first such work in this small but growing area of work; this area would arguably grow in significance as models in use encounter non-iid conditions. The ideas are clearly enunciated and are therefore easy to follow. The experiments define clear baselines and use hopefully comprehensive SOTA benchmarks. The choice of the source model (i.e., a shallow ResNet) is not motivated, and several small details that aid reproducibility could be added, e.g., the number of runs used for obtaining the standard deviation in Table 1. No comments on this.

docsep
The paper proposes a general way of obtaining a test-time adaptation loss, which is used for more robust predictions under distribution shifts. First, this loss is learnt using meta-learning, and the insights obtained from this are used to obtain the proposed TTA loss, which is related to the conjugate of the training loss. The proposed loss is then reinterpreted as self-training with pseudo-labels.

Strengths. 1) The paper is novel and useful, as it provides a general framework to perform test-time adaptation for any training loss; further, it provides an informal theory of why the existing TTA loss (softmax entropy) is a good TTA loss. 2) The paper is clearly written and well motivated through preliminary experiments. 3) The experiments show the advantage of the proposed training procedure, particularly for more recent non-standard loss functions.

Summary of weaknesses (details provided later). 1) The paper's claims about the proposed loss being the best TTA loss are not substantiated theoretically. 2) The use of the temperature parameter is not motivated theoretically; it seems like an empirical fix. 3) Tuning the temperature parameter requires validation data from some target domains, which is not the gold-standard TTA setting, and conjugate PL seems to be sensitive to temperature tuning: e.g., in Table 1, temperature scaling is required to outperform the baselines. It would be helpful to clarify the following limitations better: what kind of distribution shifts are allowed, and when is the proposed loss best with/without temperature scaling?
### Summary:
All reviewers agree this paper presents a novel and principled approach to test-time adaptation losses. All reviewers find the paper clearly written and its contributions meaningful. I suggest acceptance.
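To make the mechanism discussed above concrete, here is a minimal sketch (an illustration under assumed details, not the authors' code) of the conjugate pseudo-label idea for a cross-entropy-trained source model: the pseudo-label is the temperature-scaled softmax of the logits, and plugging it back into the training loss reduces to the prediction-entropy objective mentioned in the reviews.

import numpy as np

def softmax(logits, T=1.0):
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)         # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def conjugate_pl_tta_loss_ce(logits, T=1.0):
    # conjugate pseudo-labels for a CE-trained model are the (temperature-scaled)
    # softmax probabilities; the resulting adaptation loss is the entropy of the
    # predictions, i.e. the usual entropy-minimization TTA objective.
    probs = softmax(logits, T)                   # conjugate pseudo-labels
    return float(-(probs * np.log(probs + 1e-12)).sum(axis=1).mean())

logits = np.random.randn(8, 10)                  # a batch of unlabeled test logits
print(conjugate_pl_tta_loss_ce(logits, T=2.0))

For a source model trained with a non-CE loss such as PolyLoss or squared loss, the same recipe applies with the corresponding f, which is exactly the setting where, per the reviews, the pseudo-labeling strategy is otherwise non-trivial.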
input_ids: [long token-id list omitted — the tokenized form of the review and summary above]
attention_mask: [list of the same length, all 1s, omitted]
labels: [long token-id list omitted — appears identical to input_ids]
Below is given review of a research paper from cnoference journal. Please write a summary the review. ### Review: this paper provides a theoretical analysis for sequential training based on related ntk results it shows the similarity between the target functions is a key factor for forward and backward transfer and when the target functions are the same the samples size of each task is a key factor to forward and backward transfer it also shows even a slight difference between the target functions can cause catastrophic forgetting and the forgetting might still happen for the same target function when the sample size of a later task increases the paper provides some theoretical insights on the forward and backward transfer in sequential training my main concern is that i have some doubts regarding its strong assumptions on the data distribution as stated in the last paragraph on page 4 the input samples of task a and task b are iid and generated by the same distribution this assumption cannot be satisfied in most cases of continual learning especially for class incremental learning the data distribution of each task has significant differences in this sense i think the claims of this work are more about sequential training on a large dataset rather than continual learning and the experiments are insufficient to verify the theoretical results in typical continual learning benchmark tasks the experiments on cifar10 are basically sequential training on chunks of a dataset im wondering how much of these theoretical results can be conveyed to typical continual learning tasks as in most benchmarks the sample size of each task is the same even when the target function is the same such as in domain incremental learning the forgetting would like to happen eg permuted or rotated mnist tasks but in sec43 the authors claim no forgetting appears for equal sample sizes which is not the case for most continual learning tasks in my experience some detailed issues 1 fig2 does not match the description as the blue line ea is always larger in the figure 2 whats the error in fig3 the generalization error over the last task 3 most experiments results are about forward transfer i think it would be better to show backward transfer as well the theoretical results provided in this paper are kind of interesting but i think they are more about sequential training rather than continual learning the experiments part is a bit weak to support the claims as im not familiar with the ntk related work i didnt check through the proofs of those theorems docsepthis paper studies the generalizationtransfer behavior of the continual learningsequential learning under the neural tangent kernel ntk regime it gives the formulation of the generalization error between two tasks in forwarding and backward ways and analyzes the influences of target similarity and sample sizes then the selfknowledge transfer case with the same target function is studied which shows the universality of the catastrophic forgetting and the influence of the sample size the results are then generalized to many tasks theoretical studies are crucial for understanding how deep neural networks work on the continual learningsequential learning problem ntk is one of the powerful theoretical tools for analyzing the behavior of neural networks this paper extends the studies of ntk specifically the spectrum dependent learning curves 1 on the continual learning setting via studying the generalization between the sequential tasks the analyses reveal the relationship 
between the transferforgetting and the dataset for standard neural networks however there are still some drawbacks that may limit the significance of the paper 1 the paper focuses on studies of the behaviors of the standard networks ntk trained under standard strategies thus the results mainly reflect the influence of the data characteristics and show fewer insights related to the networks and the training strategies given the original results in 1 the results in the paper are obvious further limiting the papers significance the results in the paper so far give limited guidance related to model designing in my opinion 2 the studies are restricted to continual learning with explicit task boundaries it is acceptable but i suggested the authors claim the settings more explicitly at the beginning of the papermethod for clarity 1 bordelon blake abdulkadir canatar and cengiz pehlevan spectrum dependent learning curves in kernel regression and wide neural networks in international conference on machine learning pp 10241034 pmlr 2020 although the papers significance is a little limited by the relationship with previous work such as 1 and the restricted scenario the results especially the selfknowledge transfer case are interesting it could be a starting point for more general and informative analyses of deep continual learning docsepthe paper is a theoretical paper about continual learning that studies 2 tasks settings in the ntk regime the paper study in particular transfer from tasks with the same target functions and introduce the concept of selfknowledge transfer ie transfer from two tasks with the same target function the paper is well written and well structured each section is clear in what the authors try to explain demonstrate nevertheless the vocabulary used is a bit different from usual continual learning papers which might be a bit misleading a head in cl literature refers often to a group of output neurons while it seems that in this paper it refers only to one output neuron target function is also unusual to refer to the function specific to a group of targets this terminology does not hurt but it should be explained earlier in the paper to avoid confusion i acknowledge the use of a theoretical approach to continual learning to better define and understand the problem we are solving on the other hand it is a bit difficult to link the paper setting with continual learning classical settings continual learning is usually about learning in a noniid setting with several tasks where each task might lead to forgetting of the previous one in this paper two different tasks are not supposed to interfere with the other as long as they have different target functions in the ntk regime the interaction between different heads do not cause knowledge transfer and forgetting the concept of selfknowledge transfer is not very intuitive if i understand correctly is about learning again what the model is supposed to already know but it does not learn it from himself ie from its own knowledge but still from the data that why selfknowledge transfer terminology seems to not fit the phenomenon that the authors want to describe selfknowledge transfer would better fit to a pseudo rehearsal method or to a generative replay method questions section 43 the training on task b degrades generalization even though both tasks a and b learn the same target how is it possible is it overfitting of the target function or just that a and b tasks are about two different modes of the same distribution is it a setting 
similar to domain incremental settings section 42 positive transfer is guaranteed under equal sample sizes i believe sample size is important but if na nb and all the samples are the same there is no reason for enhancement ie positive transfer on the other hand if you have the same number of samples but with a lower variability you can have a negative backward transfer what am i missing here to conclude i like the aim at theoretically approaching continual learning but it seems there is still a consequent gap to make the theoretical setting proposed fit an actual continual learning setting i still will grade this paper above the acceptance threshold because i think the paper can be useful for the community docsepthe paper presents a study of continual learning of neural networks in the neural tangent kernel ntk regime the authors extend recent work which estimates the generalization error of such networks using a statistical mechanics technique known as the replica trick the theory makes a number of predictions regarding backward and forward transfer learning which describe well the behavior of finite neural networks trained with gradient descent this is a very well written paper i never learned the replica trick and im not an expert of ntk theory but the authors provide just enough background detail on the ntk and the base bordelon et al 2020 and canatar et al 2021 papers to have a grasp of the main tools used to derive the new results these new results are concisely presented and appropriately discussed overall i found the paper a pleasure to read the theory is approximate it is developed for infinitelywide networks and it is not exact even for such networks as it rests upon some tricks that are part of the replica method the authors therefore carefully validate their semianalytical results with actual neural network simulations the theory is in impressive agreement with the largenetwork simulations the fit of fig 1b which shows rich nonlinear behavior is particularly remarkable the paper contains a number of interesting results it justifies the adjective catastrophic in catastrophic forgetting and it discovers complex transfer behavior as the relative training set size and teacher noise levels are varied i think that the simulations alone are worth publishing the fact that they can be described with semianalytical methods is a strong plus a weakness of the paper which limits the generality of its conclusions is its focus on the restricted case where px is assumed to be the same for every task like other studies in the ntk regime another obvious weakness of the paper is the lack of experiments that try to validate to which extent the theory holds when the network width becomes smaller i was not capable of reviewing the calculations presented in the appendix therefore i cannot stand for the correctness of the results presented in the paper as i am not an expert in this field it is also possible that i missed some important related work that should have been cited some other small remarks it would be useful to provide some more detail as to how the etai are computed there are typos and small english mistakes everywhere that should be fixed the discussion section would benefit from some comments on which if any existing continual learning algorithms that are designed to reduce forgetting might be analyzed in the ntk regime with similar tools as those used here this is a wellpresented theoretical study that applies previously developed techniques to analytically describe transfercontinual 
The results are, to the best of my knowledge, novel, and the theory reveals interesting, rich phenomena that occur in actual neural-network simulations.
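To make the two-task setup easy to reproduce qualitatively, here is a minimal sketch of sequential learning with a fixed kernel on two tasks that share the same noisy target; it uses an RBF kernel as a stand-in for the NTK, a toy one-dimensional teacher, and ridgeless interpolation for each training phase, so it illustrates the setting rather than the authors' actual experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Shared teacher for both tasks (the "self-knowledge transfer" setting).
    return np.sin(3.0 * x)

def kernel(X, Z, ell=0.5):
    # RBF kernel as a stand-in for the NTK of a concrete architecture.
    return np.exp(-((X[:, None] - Z[None, :]) ** 2) / (2.0 * ell ** 2))

def fit_residual(X_train, residual, jitter=1e-8):
    # Minimum-norm kernel interpolant of the residuals (ridgeless limit);
    # in a fixed-kernel picture this is what further training adds.
    K = kernel(X_train, X_train) + jitter * np.eye(len(X_train))
    alpha = np.linalg.solve(K, residual)
    return lambda X: kernel(X, X_train) @ alpha

n_a, n_b, noise = 20, 20, 0.1
X_a = rng.uniform(-1.0, 1.0, n_a)
y_a = target(X_a) + noise * rng.normal(size=n_a)
X_b = rng.uniform(-1.0, 1.0, n_b)  # task B: fresh samples of the same target
y_b = target(X_b) + noise * rng.normal(size=n_b)

X_test = np.linspace(-1.0, 1.0, 500)

# Phase A: train from scratch until the task-A data are interpolated.
f_a = fit_residual(X_a, y_a)
err_a = np.mean((f_a(X_test) - target(X_test)) ** 2)

# Phase B: keep training the same predictor, now on the task-B data.
g_b = fit_residual(X_b, y_b - f_a(X_b))
err_ab = np.mean((f_a(X_test) + g_b(X_test) - target(X_test)) ** 2)

print(f"test MSE after task A:        {err_a:.4f}")
print(f"test MSE after task A then B: {err_ab:.4f}")
```

Varying n_a, n_b, and the noise level in this sketch gives a quick way to explore how the second phase can help or hurt generalization on the shared target.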
### Summary:
After carefully reading the reviews and the rebuttal, I believe this work is of sufficient quality for acceptance. Understanding continual learning from a theoretical standpoint is a very important topic. I find that one of the main issues raised by reviewers was about the exact meaning of continual learning and whether what the authors studied was more akin to sequential learning. While I don't mind the term sequential learning, which is quite descriptive of the work as well, I disagree that the considered setup is not continual learning.